[Binary artifact — not human-readable text. This is a POSIX tar archive of Zuul CI job output. Recoverable metadata from the tar headers: directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/` (owner `core:core`), and one regular file, `var/home/core/zuul-output/logs/kubelet.log.gz`, a gzip-compressed copy of `kubelet.log`. The remainder of the archive is the compressed log stream itself, which cannot be rendered as text; decompress the member (e.g. `tar -xOf archive.tar var/home/core/zuul-output/logs/kubelet.log.gz | gunzip`) to read the kubelet log.]
D`Jɗ9;3NB)Fh.龚Be630:dm&8b7NrM-5B5{Iݓ5sUۈUKdMdzV] ?z3K:v]Fi8dW`Pz`3-½G嶈'qFplCYvUǕKO9ṯ0KNRf>]Vs6V%tﯮʻVF:  6! -A} zJt'?[OXPw[Te Y*tG8(ģug@4Pī$Fc0A0"*&{+Me9 A <kJYvѨHvGs-Bh  P8:}!.j # EZ6[Ya-0Æ $q@Ry8ںH"K֣OC!L7ƷND3jnoVOU~1;pK^bKGtL@( kFIù3B- s{Z4;jyC9o( 弡7rPzuTV*^ץux]*^ץux]*^ץu,,Ŏ,KRT.KRT. nJRT.uaJRT. (JRT.+KRT.KRT.ZTx]*^ץux]*^ץux]B .q?%С1C#ܣcyvEv&=yc> `~ʕ/v4 %JQoYjzY|X ?={ oKe&#QHYu>k5f,`j2b=6MVHKD>ܗcgIƐ` k.(qCvY`iIR QjMJ%R.o^56dZϜFlξ\Rf=&m Fh9 t kluݶxqؽ|t}n'mFG=/r;? t];:/&Ҭa5MwX\ ?g~GK1˭/dK nd/6|^p31GVLbU3oRvghI_IEfx-Jkt–-¿e2Y/f:{ay(t/%aB"G 0V l'PG 歔a䐤XHMLcAm9^99MWP7ֆKuWO@x?;ި=bv2#%>Ht4ÞZV[fuh yQ' ^G,hh~ICp0MpkLjM#9hcH**(0fSjT` H w{;y>wpg GsFഥ`@1h$`EEc@asBcdGF%ZRk9عrHF@0FUfi%=KVAyBdd?l8z2.`4m,2VDIMr܁>jbkm$7EKMa3ٜ ŀGYZ܀Oud[le-)6bU}U,ַ?{qϹ#УA,ԢfL> Ω9&c`1QYHj f:n\WSa~}xZζˮ6LהMcg?!A3qtb tS:/G VG43#T%Ev&i ӻy ]]E4×'!S>epZE2%vl-He b4=J;[.hWuzh%ۊ+(ծ ֞iz0k^ÌgO./x nXi<ln_J˪'oއrrG4p');Nn6pmV[LIx,VK>,&zzubáxW%uYi!#ƚ&v}9eU5"~,`2Sw%y-im,gAUk(J0.qG;Gp߿]/+ϛNsBcMqnR<8X1Hh~# s]lN5RRT=vZbqF6 icOb&GY$9z0}dr?JƝ)jYɚ'ѦPP% BG}1F-ޣ# Mش"$z*NUKQ 5$hu=b'|9(^(M)%c& @&bM߄d(*;[\A鈢GZ:ӯz|ɷRzltVɈs:ּnhtDdC}s?t!0MݞjluMQVUROhs<-ڸVQrp~%P2F#H)ma]!#/ Ϙe@v4<{PKro2k(C g,Į+_]r# Fjr.]5&dj!`iVgM_xiԱ+~2nN}l>t|ϳG݆a/ B-;RӶQ3A=V'zyN<4]=#Y!E9{E=Do 6l"j \%%'h{xBZ~c%';{ nO(GC]+v,Mc#gBZZ+O־[m>=vhM h")WuV_+<`Qm48z6!#`K1@aAEtQZ,`Rh+1AZPڢetPQj@2$cgù{u䰎="͂vǒ Y,5uBO]a>|27n-jRDp 2'^6Pq6pcBC2Y {@Nۉ+26{|_mrGWҲbܛ[m<crݳM*eݟcS?| h(ݠRnMk%V]'< 49sȃ Aavh1b0>PoHF`@`4Xcnqu_Y:.J}:$ySdTs1c@cj%((ΆC^>$e.D&Zyk+o~8K8H;WӲF_?ut]03 bzLxՁK.tRQMd}}EP)3T `bm"T=Hި1 !i_>k+KdZ(krΘlb+#Ʋ6*'Rʢhk[_P DQz-Wc nYЯVYՖrӦ7L̏z_^Wr|y VkYsQjSA2ظRcr^Ur_ը~P ?h,Dmf):@Qkp%:[dX\E!RjpTs9Iws62t"JJmˌH.zz@YE襴Qcd\jAծw5`աT:){eXh#\n(/0rv5*J*B[)%F.aus5s %B)X X<#:^r;fNS$c_ȒY*X`L[abNH(~kuT  BB&FePkaW 0))(\qD Dбu6*lY|pظeԗsҠ ~)<+O Ay!CjmqBU!8\͋ײz=?ש BD+2X%RP9|5VG*dv=t:ZbIGԆ` aQ{Rzg+YƒJ1[k IP)DqT+):"]̴V&X)c1JJg'K[HYPU2z§""䙒M*ͯy"ó5mˆP7r "61DՈ?I"SUDwU"; hB B֪$P9T:>킻loxlb(}t P+٦ؔq;|CPA>w! 
5iAcUXR3[_32.YCdy"e%Q|(El{|&Xֳ$SfZ&(\MENJ1l8wCd4L_V!m=oP0Xq'(й 2`.+#8tr.KH .G}%(<*[BR~pVI%Q Wǣ>oAש5&:IWBQEϣslSΖWT+fc}r.fez;$ceR-yOjYK8DVz Z Á-8 Dr5:M7E{ZTr3h EDƐRa Q'X 9vV"ux2BgE8oUWf&7l1ۍJoQt~Ja#<ƱGGτwN/G|x yfèIgT;Yi`Ai@: +ғ?U)Lx90*HFv̢^ o4,'Bk-LA%#ˍUDiV?kgpba=v5[snus[@綖I`E8r6nR%sXmT5yq̟|/.Pm渚&vKIT+Og8[r.1 g HThl.>(R3$(i H1 ZLy/];JrD..krC1ŖcinY&FO|T< CVbT^ch0[-kF'ɺ.t5}EV&KdT"dY21T\2lkYsY$dCBnzeɘwRc*U>X_msLɐȥpV`%&v!cB̽KSJڋ'/M4]iv*9yꃦ{m^;ϔla롷'Ntr]b%q!!JE&z[e4Ϫ*l=z8Q4EIhuPNR &H^!P$@IM}mgν ,l/8mtnfznw;T~5ȭCNjO{5g]^?R1knpkfz{<.]~lrˆ[.(wmzo|Ϋ=ܾ'--n?Hx̻s@qC,מz˙\C,/'34t[ϭ͟7|q,_UvT#.$3Dl*W@$kb ~@_#=)cje>44-^Y/x>ZyV! j%mw% ƚC>I"[NW~@N]:ˤxײcd{?.\(Nh0DJ> BQs zd&QIrtP LNRj( Y18:ygk5!qȀ`] w_y ; gl_覰$d+mHMˀ?dl X'/k}J\SBR߷zx4E5%Zn8=tU=U]Bj(GYѱHE,:!-s$ѯ<[Q 2$*sLPNkaEa.!Q1m)瑛啐 NAgZbrab~6[}00Ц/sAZbP˦l9#@oSG$oeb[}m0޿g_/n\WZSB ~>PN99:QsL|>L Łi$" /BCH'G]+>\Ipg\?H>8S*-W痑 1lo?sܶdJfyxzvBhb`rj].sjwb9qeibR6yެy63⿭+??x n23?LlmZCnO^rqq R{:uwá 1Qy$  =֟n^b]vUY.I>Ȉ  >}~We1=և+ka~/9NX~0OfMQM|Uب#_lN5zd>>43<悿:&cQGODk.2NcNN#Ǭ4W%I+5 # (o0/Wur1Aqѭ# Nش!YH.'1r+L6 *k \ aQR D8J Ѥ5B9d)'vN WQ߱CGI/co$VE#sq%4kp(iNyRK9Q@8LA̩Xڼ+6toߢ΍[Q,;/!0:"F %%54rg4x`x 22k(ruX:]+˻97vYD~Db Azq[އ^]~n~~>;0MKq3ri Ӈd\-_|4y*:7k / Bm;j ̗mB'zLSCBߞ q~,43G"C8S0=w,p]""bag»g7/חV{m ^(L`0>8Ե"h׊:i=TJUi , 7sj728bRZ0xgDZ|z18:9u$g"&g) tdIEM[h&TzSr[]NaN9PAՓtP=Ѕ 3Q#r&6)@ )AiB>#{S8(b{xwCUL%&r&afF{m@GʒbZPR[9-.eO<9+Ev%>Zj I )6Br"`c,Ort@ܹ(q\S\sJc Ty+uF#$ VoݹI6Y&Q]-' lD&Yv.B4i% i7odô XgkHŸNzTޝyEfv٥9P፧:c\k\ "8O Q<(@qΗ½ٲ+4ryA~qثbndj߃u3C '«9D|0@jvVwImHAjuctOM<~PuG;~ U@Z 2A(%"bZKicsDcAHkپ&R3֌wf}PEƯ E턅Sv R)ɠ| 3O`~WqD>6>CI^D 7H9SޒG[-zբw= FLK-RDGSxKi؈@>I$_Ԣǀ9=0"ALjM<(ÁDVHu msXa^lYwoǣO=%9imv䷭|htf1 0X`E69&&Fxq $ #H R7T$TĬP*F᭶&9TLhR /EJ|Ql8wS8x4k_,}U|NY,aK_΁^rجdsY8i! 
4Jj5RۀNAF0PTzWG҃#Lrh0@~ F''!^ù-04Oˏg<= 7?z~ E>3)廫)w6o]v=XlAmY۬"*|6ݵnM$:9-h?WYen*$-F>A@(+\'#fr8̟ntwZ37^mn_,!T.`B0\g\|2ٯƆvI8ii}ҫ/[^R.d-F[8b5p-F[bm-F[(^Ee-F[j-]Qj-F[bm-F[!f9)b8DbC),eD9k Jh׾Àp6svm!~XkG^`׎c Tʱd9KmtBpV:I[h?pR6nFTD`u(r8R4K!QtRGΣqQ;C0"J*NAeja$Xl8w#Ch{Mv+q$(_)vb<оoP,]l[3UZ -/?fzȄA[/DHR r{Wȑ /3P<"/~]`<4<"%nSFYGp ]cr3hqFO$XEFB i#Qu3XH {i1N܂ B&2Вhq1c$l:d8qSEi Z 7vxeHK)"g\WQ'[$$ j$Vke"鍕݅_=^ΨwGOSaK6 ͵X%G%CuV5U+NIKv,Q1II"*SsgMJ3Bu"L|FWK80U&E<լT.bZ R(!t(JV83 &OTUKOzZB/zr2E"Wߢx%"BY#,7B .(Tx Pe!f.;aTKeSAG5 2:üӁ薦eY7-ͭ"QN ]qM'6Q?y"u_Ka.+l'ټb]d-w۟^nr1G7yiaﹾ {>W*69^/_.!&q,{ ?])ɵh9E7nFػFKlp.[C3}W,h4^'`WEb:w|m[Kq󛞏H:Zӎ˝{8sw q$4L*Ϳ {ЎN2GV4'm.N '[|xtA{^fobJN!x -|vN]UiEXEHTZ->a/ 9Pލ5NJ ѯVDA*'[NۃHIxnDrh{$qgUK0Nk'SLɚ?԰jW/߿geyIWs|d)]BVRg!mq?N<%^`bt4j4lvj4, ""k ЀB߸dcØV =wZN-کUjVXp#oky01d2 Qt8i)FgI[tn,ۡ]<l:"o5١xdl(BrY0x Rac`K`!d bH'T{3Jn- W*nË 5$#Zw5q4ĭzViGGҎf~]0ҐEJ(?h7! 6UЊ;sԍcRttrXeiEr˷n_y}A[!ޘX."v˅rhM4EJ.݂ CNa+6rJx-#je6X9 )@.2KəR:qwғ"Kj56˨>& ud D4g B9ZpkDĜ=Bh J8bzt"Hulf`N ETd6zyD#"*SllPb-E䎋\㻓_N3?ظ5gjۜN49moҧ[sۑڬɂ=V9VۏUIJ5̙pVH>xRTKFd#HڪQ[S!0< :hU&eDl &͵+TmXm99f-(e 2//XPƑPSJM;ؗXct.f =»]O+_ƥ͚5׾(B~3Zx/XVW{n>)uoWݴOFF!+wn)~Ett vr:8fmM3ÚZn=A Q׍149L5Vzp-+-[t'gٲlY%lWg/c;1.B!1Aa\bYj2js%#0ɺaXvgiDqҸD!zLHDcvEZ!tqp3)ףA?mnvd`5ғ?#=i07dnSlЃ˥лj:jcDm H="S)ϲRg@֤[뭎spTU0TVdyqm Ch,tH_PlۮWmsAKρ>b$d|VӚ Z޸j cS#,(T4H/c<9ʥ!Fkm$$%c1zX}K#toUA;H@A QNߒ5<2V5RЀwrSHxGg i[oOt}yLv]G'_Yu$MKka^< )/[?V:f9$4k@nȺ% KEJ.sV[Y"Wlm^s+*pEZ3Vp6*͜ "w)8Ò`QydFdUZ#%w4n]|XZoHlY |$ 4,:˞Ǔ,H0d)rtF3ǞwNQ%sb -m9xցΚǒ,j  hQ|Ror^Ӳ|4 Sxz>|ƻ޵o|hg]o>LiR֐ؕ=;ES.h(f6ݐ,TK9`>yx4/+RB,)ߗӸ+Wh#=&^v˿TVԩfs‡A 3\<.o7?@{ t1C?["RdN%fT8\4]V9F]1(aDAD4A:i O{蜃Yp¦!Yf9F6)g!$"J\q U!ỷ9'L'N<BMpzs}*^,5%GQ#!u.8VB<,d4ETFh^*WHE@,J[O5@lTpMT`ף}E?%7^BlG F9#њxQ ;2 9sY& q#K@Ŷ׉mJq 2ܐmYffzϷm5~ ߩzQ0R՛WE߃gJ{8_i.Vm}CX_0yZI6}_T5-ųl Q>**2e ] "ƻpCoY[TF?bf^eWfO+JwuM>7ro\Ooy 5C{;^ZH~sY!|MazuWv>(坁)%%rOڻpgv\LlDY5}ی]cD arH,qgr#gdynSeNέ -|sͣ_kR{#i{]80. 
dg<ⵛ=:-/ ^3Pk 7/ܾ`]j^]`<;9qNWqqQ-ޏombh\_]]zzzwHK?~;qy|+F%Ϭ 3Mcݻ*(S>EgrnWwyz-0., Bo&Obu͕9cgk.mU2X>LՕt =:􉃳O͘}{`vX5Hu^WHW#u`U1w4G(EyAa~[&y 8'vS1Lna9slICIZ+z.蚬*KqZ&v8]Hɴܰnޯptv}Y;2_+M4?kz𤶟5ɩd"j4-a񴡠ժ)$6F%DE,<GS624-j^…8"U(EhUj5X^TcO9[ZqVАtJ3TQ=٤m賴ՂkNQz iϙZ6yh T,nmYiR odiu$MD9T=b.+$"|sQ Dg)b/9%|=;"Z"M4^[dY=ZǬJi lUmMBM":JJ1T0KBȥaUV{l~%B!}tBb@QDȢ}~lG/YRXIGQQ%cNօM,i()susIYU-wTM%s$+F[tΡ(sRBL}d snE);IM -ڒBB+4_+ ҋHSFPK|s;E>I'}Rt0d]&|J'{X@榤Q1!KuXY)dWh!ў$֦&A ݥfѷ#*L3R%=/VQOqѠRiE +1ؽ*؀d@=kk!Om:u`;4V(5 I&jeWɕ8Y@hтsic-A::_#*A65T3[!(P.hqƂ ճFx(E{eh_uU dI>mv2Z"L.5=+()(}Juyիf!_Mr^KY&!E(zМ&tY\D d`{6LB4]k?|p.#h4 xG\)`k/fĥ"ҌY7IcL(QE!htD5!"MfCϰO% tzyJf%SdHW=$V*X( TPz6z/3q=h5'@i4\@2ӪAUY Y0ⴱc4@^BB%>y/d:BG-8C VgES ]_~^jG*:m`^֊ANEO"]%0ZuIBv `赈 (-f)c &4Kf-;l,8 TDhR8f4Ǯ6 3kђƂy0+ !.}$,Z@'J sAn+itV0Mh%QJ"-A)5.-zu42Yn$BG Pl`_睔#:hI]T&ƐZjm0&ܽg6+]qrMK{U눙$KOi @znNNi`$ti#ll% f mi26QUsT$$ѲzhvMfc`Q/GFo9+̈=Xe9p7$%<%:`rh[S|'Ѝ m$Y]fS:( ,3UbF(z DekzD-UcW,B;Nw؊D4cҀ'7$3`g +U 2ݡPhQLj0R܌"2!1t#iϬsPp@ أt%h#*`hAgp `NmSƚv+,4R;k֮Tg jtϤyha3BBvVi:-睃 v)SxZScUUbmQic&X©:lti = LvQ$KFD9+T𚮠:FX2\\\1F* DqXoCn֘M5Qr7tid" J.YxY 4ic#ɨ!:M.P,=wSjI.Z juhD蝩yT]pB[ P GR/Z꤯6kqju%EIQմl(KH-RFq#E{|J |^0 *-AR֒Bu ~)kiG&AM5t!]RYd%t;V?{%t@ +X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@0Tk;J 2ww k{J t@(87V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+>_%PD]R`C@4'J?S dhh@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JT{*hi=2r% +fnV k0F 0+~^J %e%У@H#@>R`?#%oT~sJ5b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@K뭸S~ -5׷~]ZEY]\vZI-c}IkQ'<#2zmo.4}ils\vEFe“FȣT?X%gjSK]]FoͲmtX>J9mvi280Xk4ghN*Wdw;Sv\Z/s/Wd\>rBW?4 ^SPe<_}Ó8` 8\ E#[fa2 b״`3m>E,=K]H{sG]+щztPA+LO_R _Ulr2 Vk'>ov・?&v#{kݚ^we7yڛ+o?j^n6:;\m5]v!FdM:ht骊5's -y˰ 6l]'6ƞem:r;=l߃ ΀+*"ن^n? 
6c\5|q;{;t{ɻ’ v;owl6ޝ gjq_Iy:\=I[ErIv`Ke[_)KNQG张>=a2Dk~|߽ D+W); ogoHGeÚTKw9Xʹh-%(>"zE5ܞ/g 0:˄_B,嶺.WA#eۺ?}ܖo_}Ҕ]od1ܿUf:է>wݿ'h@)]&QJбrBW)G?^AW|[GS66s>MڹdiH]CB ]f.Vjgw?!v"(Ӗ(8ِ(6g?Q\!7Ż+Nv?n )gҷj؄ϿLwQb3mߟ(DUU*UǯZZyϚF{ZӴAo~d%f^IWW{ʢ9_Mk%ÏQw'!W,ld R7h,2T8@U?zIeRE;RA<?'l,Y))UmQij!Um&~{ZfeV st晄qB:-*7t\q:2q(_uͼ}ucҥlkZzNs˗.+-KxܒTDA2!kԔy+CFj[\xA2ʚjl6CnǬf6gWv s6w^@'CbDƠOŦ8k>!Κs z\yI03_] /t k+KG?3+O@߳ˬj} B->@OyN5t^I d^ڼϋX>CvZͨݍ?i{}d{9E` Vhk4ǜɲHda=_k&ePM<߾Sm]}M`5Ī;uZ>bc-vwCux?π1V~cRǬMHA@!zi %AI&bWEu."²rW_ Z-c'O.w<$S;coD*+juJ-k1\H=x$cMo^ :cѡCAy5+M6OC~HȤ/4s'5Mn0Ɲ)7ΛprH6WۻB:`%:3؛i([0]BНN 4q8"H꣊ .>̊BCE g -eU sL!M$ɥt(9Ù&g^jK:r ?[oiws^{n>yߴ9S'!fg2=ls&[K[oM}Un2tH Mv&3esxatw~ۨr]ep+E|B>9/lwǩW蘿 їt~5骯I9ΐ { \Pc(q>!&rHf3[>,xJhdNx0@.PӠ2 ur@FycEˀ)CGm+RE Agp$Ԉa[3qz1Cv6/R-lc)oam.nzBNŘ=3vuDH-R(:(LE(&"J2i qSS#($d]ODK d'Rd+ei`P y/tzW<^@S>b,Yj'gXXYjTiw,Rz5 kX#anvV-Hzkdv3슔ЙP\RpUL+1J a4%/.%ޱXTͦ1h܍^p{ҲM2_@䴻 "OC<>׿mv[˱3F_ոo>B\}6ξ~:oU7n.7'VFlΖb |F1'? :zh,fnv/.?=0`/<[puuf` `P<LZ Ӗ%}jav)'ԛLjk< d#kBUL/m.4T-wIZE|tpLqJij1=?eKŽC9Yv6j"-X$jU͢-h9j~L3Iw9c՘/IST8;he_OӖ**D v2cɭlh]_6KPQc<[G)\MOLY7Iʚ>p|Yz5M. Ozz= : 6kN܏"ƽ80fCjso YuQPCadW;ƢшSjb_hBK#=chb|ϩ_~~X\=kRLΗys-*AƜk1JcTS~Phsw>OhOSP=\j (Y"49 9(/WKrX#R6@ ֥F~C%R%hSʠHR6ȸJ[mBE.u3qr=k W;vf?/kHur{v ~u tXТ׷{]C_ܻqԻ}}yhb+ޓ1B.R(ԞAQIdPh Ђy!=PRM5 TEF %d O%JS(ŀIAJkj_3f)gхVƱ4օzЅ'ϐymg]=?,/t0Oƣ/TPJTJv!ŢA):AX蠂S []L-$%4ҪM<9$_+ {5XL&LJ{g$9KfmuW&mҰK:{V) ӱSIQ 6Ԁc1yÎPFX:q )첓^@n2^-۲}3 m@ڀS"S#ؓ8Ы>z|!YLJAkOp} 1%@]t@ u^y) /8t$ w * :i$JnH%,l pNBsfqޔsIM3.fǒ~NfvK+|X;E[ouP[+Tdn-F7<}QOՉv}]O[&Xͼ6rz=lϷr|{&^`y:v{ kw]}Ksu}=:|Ѓ?7,C 6͝ډ4KM &veLE ?vt=[W/=[ȳ=;9NBjiMxq>+(ڱc$O֌G@ƣ9鶽*E $ EH|>TP +'`U)XIֽ*[s@~^o]<7w]U%i0h\]>o<^z?u.~@V9 QӮ")Y&jcbE8$mbVJ$m< e4H٥lL1(K"Xi&΁d@ {^`Hhc:&GLVƊD)NiyuA Mt+zVl_# hI3* l Gm*P Mev/.Kʣq/nsB. 
cG{H S4˄LSs,c{%p$ML0v9e/8n mc[5#4uV/e:^~ـ|:kJL6'E3Fs&TC&φx1a |OWId  ڨ7=QaQo]K30 K V*Y"AJS"f!A,cLNȥhQz.bJ|PHDllAvl殶^?8ߧ\u\ YznCx+=z*/kvt&v4:]>(I`ΎPr]R, [;٭LߺuQMʖO6_^qR~ NoTr\̌i]͓uw>V7^Λ }b(BS\^v빥vn Ez?^'A;{wnz ChAO=1>Eݐs^,}+3f1^f<>;Ѳ1W:^UaI'?y$ua?pC_H#|~} 0*qS=ŏ_ΰ'Wl+K@W褭(-/+QpY>r_VT$ UJbW.\/û?ޞ_?uoN?}| {_ཌp mL/@w?5oko5U1]6GL#!/wpjW"DX, HqO $^#͢s sl N q FO:?᧋ƾy4H)ʝۅ_ (_RG%S8Ih&8RSAn ca8֑$,+Vs,$|*;^*H !Pu2 JB쐎&bc"kQ Q>jj&J> Ai)!-E AOg1`z+ V&iڝ!xI@m@k';Ujrr\h#LE..'Tv Pf'ߟnڱ3FTR4AFeQ,R@V|Vm3LNS9׎SoP9~ m0${8i:7mԓ<7ceMc%iw;Rj.7w~2;ݢN ]zZCŻ/XyAuu밀X܀66!jIFf)vj;xF;w8v.5iK-cp] qւ(:ie:M/(ܥVpXrhf9YZrZ X姇O;}l`k@sހ< <3cĜE NVFrE׌RM|Z_í~jShn{AC1a+߽,zw[%8[^K+0;1ڤRH{w}}&XtrɒRh@P3m@fL)m9wp*]@8 פ jE1ǒ M0&3{Qnv!MqZvz+v`PF;z}HydX Gg |{sh_> QE>xgq&+ 37*+侈D&.JtWP\iӦ &L썸Jj/ XﺸJTRՋW⑯^"N.V"Xcj5r3;F#VR2(VW"u_=&[Քmr56Kl㎋wSN70 ď9=F~}GuSمs\b:C"j1NJ#)#틔Ơ\R:QpүPJ{$ৰ1% 틸j9.gq T+G (q}UV]WʬNq!MJiQ~W_(}{<5!drV8:QjB^2A1}r`H(`8NUAjRi3PdIDX<1!jNFS:OMϥ$IsuQQ_3RGa9V1d&5M}dor%r:KrS.sד)*+!Gs_n[BlK*Y0R`D$4YgDgCOГlܳH*kPZ{y+3(!Yey4h+=kz C#eY:,Rg2DO"k/%%\ P/-s$k[gku6n-sflX|%niiDމpϿ!w=(:BgX#Q BIm037NxĨ>(m͗TGAGt{0x*# ɯ5fs3c4㘶T"#ؤwL9T#r^tn^2BH:T%Wq0!l#{døޭakVSIKZ1 JZ# 7FE#(vFp*+R95- Rώ!T^cfTA,b$LCCT)l$0,Hd))wR#*#W3 A*!lh2F#l2ױͲe-Yp{Wr=2%>=aQ)4 <1 Z"Lm\YKY~~ou@e)";b쵤m8&6Vi+L 4L-mȸMIÔGk.I;&,mSdI-Wrx3K|.p{۔0-Ӵc{ Ev2CgζH jҰ=ie\\a5 &irya?y'&aL԰zDfIGzk]4Y'"@'}Ӿޏo<)Ša>qd8i6 rCRG^vK`lJa^iGjd~sVk/;^s@j>c]YQ]fٻ6v$W<'o}80=8ؙ}㍏X,o%"[jYv4Uf+Vs6%Iy?KAyVf7_ʽl!vwXnӫ_9z~qsum v7쇠L7grWlXl8nKu$ W/f<..?]O8< ldB a}bjZ".z+Y!IdyA椉 BMÅ(:8/cAxZjEۭsO|Q˲l>"\޴#١ydLh2䢄N2Em"r)(+ R:EċS0`s҆]!JdW<)_2J2d5R4gX~=*{5-m o=܂+&dy4d pK̶{}_^: /;eo ^]ưePC ޥ:_ʟq/ l//_e[{N\'/:iPrDUuRa ͒OL!zoĹ؟K>- ۇ?-q_|nlc64t91@I7;g}eM ѧ`킠ܱ7g(@v0ɝ;б~;ǚCy\kOYC#[h]-km0kc{kU_c?1gkG/C4ZDcY"R $%b\y:Mn}oW}\V"01 !B>{wr)(RփQ!D*xAE-ȶ \P"(-+E ySʨHRz6 zAhRtխف䲇S˺O}t}Oί.o <$ ZUH7;+3aP[*'.X E E%K"$S"SP ^@;>bI7ֻRQx TdLqKt9Yi5Ji ֖8Öqr[lek -ԓ-єw 8+N$99Qţw]tPv("u N9)2Ʀ4#jGC"IAeFE l BĤv$IMfF,(jLItx9i?gN]xyBnmATau;X KxQVDutݐfe7揍O }8d-V3?}=^[eYNӏs9V 
o̵[W`14W_5t϶U=.<3?XN[nEmG*h۾^ݢk{IBg$]@]װيFl'Flňhl6GuIH-M6"CMpNuYDі %z+Qf}()ǜt۞u  &.h"IC(HkTsRuϺV|v=.Nv;:hёc&~ ؛-o?[zn]lFKiE08EYm)@oɠ4a#imDX(Qkm`L.ȒHXWc` 2ix4R@'Oф $!"rz"EUZd^& 䡮jVlOVh IrT—q+ b *',&I9A{U09M:6&cD1B"좉~)8JToXw~{v[iA//l(QAdꟽvQ*(Ze63E"H6H?YXx2 a8a!K’*"ƈ^ʣT()z5oR[WA:(RJjk&)*Ȑl{q4iU%&04g.d;7w 9{e3'ugmnݳYe[ݪ97<3yh m' =y%eYbmcs2<˫/i2=Mdy~4˃|6ɪd"$VhX,SVȉ1S䄛,OSuO'p|%*`u" <(hfZ?2)}] :qАP/rZ#8,છB6ZAO?!;f|Jfƈ ";rFPfIHd+VdXLf2kbr;&nF#w<-*V9D&"kD|IRPhBT񘃊Zm 3(-+E M62*!8&E焧8*Ojo|λ<l uc=v/'~SUdMi Ҍ \PE/0OLA-x0%ֻRQx TdLqKJQJXHفpr[lek -ԓ-Ӳ1ónTz$'l}& E6Sb%2EjL8?LJVljȲ<\=( jhNAlaOJ-v3q-v8OXv1 ՞<حE8Yw5 y' B)0wV˚+a9 SJS{XuX%0BfPXtvlkRfdqd0b a=_T8l}kMch&8Y*x)u52NT& x6[tD r8N&-t~ҴIFioZ'D F,Qݸ cp>єHvD2'3d7''j cLuYAءa28头b2F \OO"IAeFE l BĤ XM&3mtdq^f5$:`}_紟zmv`.{!hRauO,R}:[n2`*OI}8d-ff~0rz$v[n1\;r򁉗l`14W_5tuYLha2qmG҄ cQrw "bRr=D;HhUa;a3v?يO:V4Bg:{$&8\HhхXgtȚ#룤smɪE A N;-L].DTwmm V}`,{ll îWkTHJ,O I(j$D q8B l ]-r6 MQѠ?bUTw(<-SӀ0:D _588ߚz$Ri#HpWy#%I.$"DAAΰZzT.MlI4Cٵ \($Jpbs2DF!tA!A,+c2$ew$EXI !B";bSV5wYjSywq;nⅲQjS,*wJ$P=zJUti\:%kLdm2, P[Tn-WcOC.g9Y>q F)gGVI)+(^QA$ț靌펌!W-k߰onuC[w3s>;uw`3)ܿxWI_6"yTi֥J- 4D)"ywnq;p$"J-]K<*!f\9}B 'ω4/ H*:k֔z"\ІH;' H Uh[])G"g*'0mk6si2ϳc!<N^ .]E7.:z$t$b,E|lik'Owt™R( Ye!De5}5]%:} n1~MlAQQpA\[3jD  3|`, S㓃샙]v4w*SB y//ȯ"`LfBFc ;}rzNI)=;7ԛLQ {o-I|B|EHth4@yJO8 }4se.i&Rv>:?l0~͞.dzuw^T7Mgeb8!KĜ؏acblEjWgY|pqT%Jږ\ɺffX2AA q8t`żO@OmO:'VզZ]WM:NHn}6 Td>bT\R \Tma?yoO<m~^2C8FIeCd?xSvbD]U{]1 wO/oo/=sD>:͋g~2:3 މ@-4ml7h6ݿ]hMPb; /o.[Q_G|U [=NޢQ9LFJK'.`w|B (r2oOl21|SH}TyPMFֱEkkۯw?l6J I. <a ܙ BDNHR$T{>{16aʾW 7Fg9y?GRUJ0\YCZ?Z cf9ya)̖?S4:۰KH iDbZjwl|*X'Z;,gƩ,Oh-E+h5T*!K5ѐ(B3CMP3P8`Y b%+Pȹ9Tux['uND3<_Scx^672:V\9_Z?'Gf+r/w&Eco "*ʣp[)Eve kNx[ޙ$yڿaoSTjŮxc;12پ}S̲>$4F&ύ?[0elJEJKTn9cΔ*;0 vݡZBt'ZK-:hdDh`OHVr+PSL1'ZZ#1HZAZ/R-}\υG\Z[%3"yh71Nd pC`*2UMЎ֭0 5s6\ BUK˥+}RUnFvP[[;PU !Q/@&.wZJGKW# Ýir8빱1rfnI$)xFdFC {ZJ/'@8/)QR{K1LRR_Iߵ[,)iV|c2q. n:9uD$mI: !܌,Xi3CĩU. 
}$ZNy4.jgXR@8ĥSc,ms3:trVOq-\Jt+1mUPKns_ʼ$+%O̕||[5qjd8oeL'<,{*\.ycOfXJ>4PGrW򧃍Rb:_M1Drθ8*!mR5{ǽ{*QZg_^tnS"/&m&W/%mgT*e^X hbҏO]'i\s&wJĿN*?=?F]=r.~so61KGm-utqwfݺjǗ ܬtT<TԎڲjm@ʕFضo?>oVQz#Q}& 73ߤXPƌx٨*Q6^X?{Ya,z"+'4 Dkm$G"i%x3P؇`g홗 ^]rId3`fJm]ReIvՙ3d FhͬՊg,mCu;N ;N؊7w 4w 6 ⛽ԥjYT&n>U 3WCpv OzV8 'qT8֛Nq&V8)qwf ko#4[xmK|?/^(^s/th}|mC(/&.]pgP-xz_` Yu_]%FtYFg.ϫ]o _vs2jaL-n(/sӮcL0ՙtyR:̙.]ZH|B8;ww)gʡŔc]%=s3  }6Y\u6YCC?0d/vTǩWvBpBvyUNJ}bp+ծSx v)7rB}>߼p:' D\ ~I#U0oG'7&K9W+kK"#GCۢ~ORԎiv뿉E$6q,`<{׺֭_FGRB[5Ӧ#|Lonڳ3irJF=|IV|I~Il'W>Xx%,?c%KB(}dX;|vf|u!SƐﴝ^ r(%#7ޡ!Ǵ1zk_b&+WYZ%N;GWU} 6p>*7}p=zpD*HV" 埚 _g)k5[`+ BVrm%`!5 naACpL ӎbi)3%w$V$灓,z.%A23g4@4{7AhsU½ow׸?5lh^W8?&0S^ \vkk!N‡{k5Poyczf>FRme!RjΔ$ b}-`I"x%5^3{j-JH,(-8I$_4#x1q]6 &eF2@l5T"=6',籘8<61|)~{=t3~oxVج[>e#o>TIa7{k'l}ZG%A'I8511 KP$hWAhaGȺTiPvh(Xj#υ V[\.itBcHEJμ_Lg=#Ҫ{g_/tIĊ+l:5.?2?% ZpY(`G(-CU%Jm:J^ 8 UNގ|v$=9;rmݚl>ݨ[m[ '8}u]Y_3/ꪁƯkMB.I J#FRׂ0ڜx! )$h&gn>\oJoFL?n9'5h"ڭok[>\MU7ҦR8}UW_^ABD5WSe 8T,JgJ$s^{D8".z`80P)Nbs)H 0I2jgR&D{E(KĥTL$АA3CMP#0 "EB]T@ dQb_R1q6#r߬wg}LPa}|[[b}7YWKwk^j}d2 qƁGY}kK0 I*P{<,H/,8P큧 '_B=+GVR1kyʋ !X"h z%2hZAGz+)X]G &"E-, 6㿅R  6zިc5޾q& :F#ڈ Y$r!Ót E XZDDuID.z/Ͼ.Q\Cdj)at-].Rۦ oꢷb  sޱhϧm69 6Zp3➱ vcȡ*`:khpAb"&Θ < F YJ6׭XOjY{>o\t ap%.!`85oꫦd+%:sk-j̢Ye~[ԭ\|A-PfVP%h͟:m=:R\)r0J5ϺB%2m .2&Tu@D墶 @8e=ڙK==fJ۝yE 1 @Z$48R ͸rIOH#eY|1sLO?D=N+TlK9N9;E1Zy.5 R1.$=IqsёؑF 56~eE]@d|t%"R:::#to:qm4qeٸDFtTR$@.h[y#8Ÿ@t"8-fh)q6hnq$iy8bd^8h\n'wP/N&o86|ili|y\EaKM9@VFASd9evEDQE="O_yӢh4%D*@8\&IEyPR!(Q{4ŭˆXL͈{aVSu[%*OvtyؐISFY|X=BG.(>H}FB6y0hbLrbOB= ۆԳpQh|NK8Cx 5$'.8,i?)BHjE%d)=qgDyRMŇ6 /!Ux!.2 m>)[ݶQ5<3xZ{|/ͥן-}t~f5;hps{7M ~/&7$}g~88xML'zſvzAhsh+6-q iU I$b+Kl((;ÎO)/tX?F];b)%a@&C FSE"\sj=8[RRtg6mvC.ܽrNG!׊֤|U76mS~ӏ(S6I$ͭuDr)v7bEϏK$rzkݜ]|ا'v`-=p~MKs;twvK~.3NTX[n+8D'?^5}mO|h0pTWJ%ZimY$ [䒛l Ye{9[r* rlR L@h8*rcQC}>xQc;W(DqŜi80SA=#`A bt8sr}7nCpEI0$Copl+v7nkxp5K,VzQHpWy#!)K\&@oĕaR0ʖ"y.4Cٵ  iY|\D"F!tA!t,W1q::^R^SNGSYENCBwSHV5wYkSywqqul$bG[g| ThOuRsw?Z'Dp#iM`e6b;U0:vJ:|Y.K2BQ>k:pN V{?ܯ() p9ױѱIƩ2?uFJw*kA;H@45K=_YVxVZ84~<S]WP9C) <1T͜}"7Rb:ߢE_2(A UR"; D锨ăCx 
}=W\qoΫ'9>G~hm15tٻ6dWl㑺"Xxlll ,e 6)R!)S=E"G) {UuuU Izdާq)ZĦ/xU0*sUzVRQ4c)kު@o ۵mˮ% ?z឵-[4-z @3v?9vĞgu?ʨ 9^V^z[IH)fڌnjb5DXmvRB^Ǔ,d#< P1kҗ*n `l沥|n*ogSsSGK˕Otd|!Χ`=c&,;ˎ` ruKQbj9! z͠bZ%5B.J {Y_o, l$jާ~T{h3@n֕-&/ĒTפQ-w\HD+|bٕG4.qotV}S|j9يJ՜#Ε9MLn/aF6|aEp511S{&k rmEjuן/n/KƖ9_:ěG:]6X> ȥ,2[ĘZG U,|}-riw|$[:`:Qp QGe d > C)#~;Fj.%08|E?&iೢuL/u b c'ҡ5 O'㺣άgڵJ0}Tq&S';{/䇗^P}q7>y.˿Kp#=+EYyz[dDDɐ7k^?i2ZKS(ؕT++Fz@h%) \t6+:thdK7O7޾7p&GkSk|1w >yʌe]*q ]/9|Rkd+InEfX<: nH;fnxKgN:;a "dI5DxFS#32,ȉYPh&, Ik@ VKfII|,4,g^Nn)QDM/θh +Hy$Z  mEIx,?U+G8/53Bq&\)𾜶:oLQxGC[ ^'~+ nY*!# @Eȹ5U3cE􌢟K9űSe#,[*f {^+B FX\Y<"ka$Zik."B4Կ(F;#|MȺٹăN!VrޣBIޝG0H:F 3`8-H0e+͉{ct4!JhX:S a %Rշ{^>HYBBl0&xoeg٤Gݳq96\% +PF'U0*阯b{vy3fh$gv]ݓ P̺̬dGBQL6K6GTL ݘ`֥2Y[%WYf%%SzMVA 6vN։W#̳QDyD K7_qdXo7l[PGk8~Rdb B-0'. a/o8??֖[Bv癖K/Nl9w_ mFP[\_߼ڌZvV4o3*iLrQSeeO4`-g]B>o z't u>z׉LOvM:*Z~$\@1Y<kUI׺a.ueRuh3_fjv3O=:/ySV*wߟO Su2{=R 0v;9gEW^XiĽyjWlэV/|QB*m0+TgBobl}M|6-mgc[0,"1޺2q{2skdskؽ^2ƒA_6`͠xG53:tBcAv?$2 @kY#QU3?gn3Xc $WG 1bjf{Vfo2Ԉ\`v_2 OyZT3 %{`ި+Á+/ꊨV"*᭺zr>B{JS7PBhQWfGo`՝[ vuu7r<՝D&vK];+Ӫu=K{p[#T:WSV;/buuns0g,%$o?CiR?_|.K>=M~T.N*zՄ7Fzgb1ǫ-ೃ'5O|ե;zy%݀@-j[ݔ+j/Pk>?I'nOsGd4w{=ݞnOs79jcxx^nOsP=B4w{=ݞnOssSi8Ĭo5,U @D% &T*zf9[ڐu#;֌Y Z>@CT]!A!T>Kp#=8yv^B̹7:fLt&~kCʧdӆd_9-ł l7胻v]_%ۋQS{4OBk,)%ȍ)}@$)8sN nH;fnxKgNlg ~bb(0DZ#H$gd?lT2,ȉP,$"N[*AZ2MJ(0ga9k嬖/$fr=1u6FېUD@:Ԅbm+1@2 ' @ryI*0%ƆÅO^. 
O zK͌P\iI9{ǣ@ Ojf7V<(P1ƽOj ^FQ%d!"9Fˆ䟑$+ϤgDVq48Fi|~ڨso4<;ꬅ I B3ydCo#U4 /ɔ>[9hc52D"ZeW_c+mEDaJ?AzУXd]  IwF!BwQ0&}<N Lcƪ08++aO1z>*5ޑo ȏ77қk QNf m%X O6=zeƃQl\rFѺa%{}@Jd@%w^T,wҶ޵4ű,Bxe/ZGnrvx{FfU` 3HN恆Ѽ40։#]TwW~{Ca+@T1 gY ^FBȖTª9vbmq6$Tt-YA]xQbxqg-a*;ea*O+*^K8#; !yR>oI^UXkFv kpZ qre$A!AHzm^դ1ib]PN+zQU&Q(*-@.R(8ZKU]J I:5ҁ.ٚ1&Aݖ &ޜRJ6&7͖VvzŸ>qGpW<σ6\/""}fUg9> TeCFe; ED7 o]Av[bV1ӎ`ZZ;oGy9n<<}nzaWq/ۯ׵#pt&a3UeHc48@79G9;fWE Agwg&lvēR?{- "V>_qZ[NƚmZg%Xi0TŖ /LNPamd6aK1tmW9a/xrv'x/_xtv!pN~w9N&7-@ #'hTGUfU.,Gr֛#E%~7ԊG}Pc y(;*(~k»iVOv[ukJqm tJ2_0_c^D~dIQs3 {1<(}&dTk,d82Vc}̗\=RG#TkZ!g)@z#ǥ7|Ǘ_37*\RPM3zYu.)Gg VmUJU%/Sa7mͳͰCFfeEruX|[ csD.HW0X1r{ρ!)@ʈdm!C5 1dBKܪRMG"O=  [/ޚ%_ lى/&AiOo0nS:bUFP\iI :jPFAY؎E%ג_)8AEώx#^\@w0/euKOr--Nf\ $ڱŁ2z2l231/w 3-h1'N{?JLƔÄw1cηIΦhv|Вp,>p6/dpE_#<_Ǫ~}409 zy,&eXو:%@yb8=1&~~Nƽ<ݐj!ή" "Š$$t2#Gzmz>Tcn{?Ox{mg^xX3s~Ϙ6\^orp~n2,@Ju &h՘B\v O2ITTdm(Qh5)Ebw>l]ԷH`𮚨"E[K%#Q&0YL| ;3n90aͧrrw0+}!挷}>D|k22+Wj@U#FjH#+OٺhDdu3m K1e 'kƥ dJH tn903%t$0 2)r)i|P͓J`BElI Q(&2yc#)B6Wv`XB6,ܱ37+t"YNJYd( *?RrƩhb5nmi5'WNkGlc ӇIedL )Hx%\\mA-CEҠb-T2vB2v˹͝*ӿ}`sߣܷm5ch|#sAkg9Ԑ]~6cѤ1  :V1~ObG._;Z ASuтDcaU8DmPQ U5(:X7^ŭK=g0g~2{wߖ/{>iIp#Oֶܕn'ޡ6S0vhZ3ڦλ2x= ?9?ADIeD)*@P5JMLjUC-R39N?̑r8<ښ|iyŸ_Vk~9RTF9[5G&LZQYc|gR\ɐU- B2*y^̮F+;ݾ+&@"7Xgc ͤnk9Eog Qv= t' LY _E{W}r 9E+je(]2 a,%V6l@pZ>ZJ%RѡTEsхR\bmW\vU@72v[1;Yʛ`aq,Bg,#xԼou⁹Ӈi^T6?o/w9bWr%εs6yɫLܚX&@46 (Kdii}^%LUԽf3¤T3bw[~ĦIYQ.OSAmQ{d;tC1U͂ b`!(+kFR\+V2 ! 
3Բ,Ζ`Ma G(R,J5onُ3~Y  bq,"ΈFDqw/LPDQ)ԌtVDoe_Ң%Q\-V1"bZ)@tcv`u>h[pD"g嘳hŠ!ʽs-g?"_j\.؜.9Cg\#.ӢrVժ0legM`D3F9EWDrF\|\<{gq*xޅմ7FoQo_ԕl&( WT*׬%( `$qH64Xت^0Fi٦Vv竁'eFc|;~ hj1;R@)T(5_y);d.u}E|:vZ̈,֨2y@✨RQvOSOZQrbU,gfNRd<,91j@1)V[}k NM'_y[> ϓB, [;c*tRDMZML+v։GC9'тIz%/=؈ho| |ѡkvGCUԲHejpڎY PsQOlϮ'hqߋTuZL06|"rwg7]|˂$g\60|5a~f}~6,lOU̖osB7-*lMliO:2ЌS2VO_է6锿ç靀O|ŵ0\RIɅ,o^r˃^jpGi0缳%POtՌnFBg3gRDLIQ4eJ>.zzj8Թyտ'[lsNvWˍ\:- i#aEoX};%ʃ#ɇ;ki0} ~:γ1.Ϙ=%d+ RT&ʀhrs5O 0ѵJ3;ttULux~yח$?_O_~?~~{Z De"-WDb"xϏ?=iUijoٴKӮfCRrG{.kWB%~"L݋h#g /~ j&98+8>qIF܁: A^%@"ȜBfT87~z/$ͽG߰Qe gkXds` V0i 8q7Y"aӪ2s `BȔʕ-AJr@/99g428w2ޱ!4nOŷڜ˷lU@o ^r:uzT>oEr8L'L%~.--5r)S m B]LSn8ҳ}:'>nERoގ]:,j>i0D)WFk"O28L9sYǸV%Ec\xI{2[i0U/x0L-y76 r}݇K}׫_ǿ'yϸmzVHf/ם{MÕS[_?ߕ bͽ땝WSz!Χʖa[χ S#Q=MFp/48]UoG \Yg/nˡ5RQ HcnsgUѻixuD`kz,1^۫"5ݜZ[hUyL8LˀΨ =4E\RHة)R[,AccH`ΧbVWs"SW$ಇ7WNqMpE;q>쪈MHkU҉"\imY,G kFd6{kv:xGEeJ'tsp;-!Ddu;J18>>K`Ύ '.?2>OZ8/|Ā=h]U+\ֶ 9ҿ o /r0Xvq|aL}P`_>y=,NQ>]1^oo$gE`gE\Ɯ:L)HQosnΆy4S?0:~\NG۰w4P9ߝOƩӌ):mSJz_/Fկ{^Dw':W!)>6C2^+`#CFkkῴTšB_tG~޲[򖟏4(CDa㌂&Z+JNW/zl;o..>/XqJ/?j~.; OK虚{7+ou*P;%uK{K3(clóU>>G&>K2e-ŎٍاJqhcØV =ÞLͥ%*t2DTS٦0EO¨.j .=-cmbLmS+7veyRp&[?9jdĕUɓV9 جV7JNM0#ߨOΎd/_]3CyfcPnv$Std &]J{͜CD͹dh&zvs(޳R?F4'^0x=[޷ق}Z VVvyäao'巳ùZIGlB`h#2*hCC()mA2Ӹ lFYcp Yv[s2qf2*.UL 9?1qH& A^X +HȞ$20Tg;0FFW˾)1x[gz\GՂtm66tjGG4t,Sy 860H{"DlTc ҶRSx&7K졧zzym5|)JEu9$aN൰ & C,dy"Ey O>ʼx9C#xd8{& /)9'^5Q6"z)X Z&M}D!#uCCK±h}ijN]2FG5W/嗋[.yx~³K@W}x &K@{$GC4)j>Gk [-{X'NO\^߹fͽ ox1g:(HhRȁFK&q/LZRhgUK0Fރ^+30ff/D$O%&s^T򏼨#@V'UgP*]{x#s `WY<}HwOGϣ*8;ۋ"㒫X2F8dDӤw`5%2A$I} }1nbciDy8i\#<`ELqD:0 Z 'vbZlqt;) ڎ~dH/av+srKz{MEѻeu(HXO)Ƴ,Ye5z<''!Je^2(4P!@"O)UҘra@xF*J5qv yLAzM9M1/^bPqts U+Cњ (A ͎-Bd_A.LLYWБ#&,~5YaM>;gdvj{U;1ױS1#,(ށxLDƙFkm$ PJ ut&Ų1̏j]{VFh_*6#mBI|Z ?h M6.6J]jaƫDT㼵 SMH7A\.p[loO򣐥!iCVV2Bc!2Y2FMfUM{GgtBIi{c9--7~$R,3lM[?_jG[+K{uqёؑ֝Stحe<ڻE[k)kڻBCֻGmx||htb#HFj ͌}*cIUV0f#m~M,eug*OƏFϣt;3sh1me f1hT1K+4`ٞu[Z<&ϣ`2!EMM$)̽ &%p:bʈ]M&c\3.O͎SAmQgS+lX"KM"=5 9Xt+t>L\Ɨ2d!3$V2Ld#$lD 0+dd|rَQLQTUeDT="9_)s-`V9e9Z-:+ɜKdJ% ZWEDg9cI.k%L Utdc$KZR;֎ݫ&vD|U2)uV%⢩7h%bdCșy@PAL Ƞ:2T=.Of8<4 
lK~yvܺx'j"n{o{ۂ/+mHЗMGeXxlll7Xg_H|-!R)JJ5 3~#1j"y [oM zVXr g\6={;D%",c>E.1P^Xr&1TGV!#l^ւ Y;0~XpZ?2{ˣ'n_ Rqy"]u ĎI sxLZ 5YѼVNֱ˛(3}ܝp0XۋLs~Џ<'\͊-W6 Ԝvzd/˙׭ҼUtǟ0)uZ|s@P~]O~mt:ȓ P t1^%?%nL-U]O*?ܽ'<ς|WfLe'mT9=@("w)c*ȌVɪl3'waxo,<_{'T#e1h hY@H)Q;΂t)C"GgdoLziP,uh(#a05q![*:k =)}`QKU4j->x}s$7+Zqp !`Ȣ:Nq9\%YWR681:VW @3ߎPp-}.EZU !R )H(]6VV{7›Rk8ͨ!ozO߳;8!? ^o#>7WˎwjZ-8D$d,xRn*0:Q`Fۻ2R=eҿTxMI:Wɲ5*[-9esǸdb-%qߏFxz_GdOɵwzBt1rDfq57/4 x8<,E|< Swީ;8/3_(Ⱦ6Ofŕ]{]׻iff]Vܖ%g>Mo/یk]WǓ^σH9V˱(cߟ n/%-qΗN5#7#ai3=)bQ4eJ>Lztf{s*V\ҾZnQ9M|re"pXn$p^rUSMS{ӦuvMz2k+e]^߯K.\iM% q? l_r~6 NGZDw#5]lNU;;T?\݁.;.CWhdb)2Vh~fЏ F8;uYƠxE }ҐBXtMP9(n>Vݥi(reKШRȁstb*p$ޱC_JWcs^y_~ }ݿX&GQFN ~-b6Q$ڭ"0pZkLV>UVZ蕫҉>j Ke*X VAWeJ:[Zm^?gQsl҉uʆގ.=sr #њlQ ;2 Is LZIG ˠ܄3fAbWxOT[ޠs,hMHXCNgikG{u__S )w [&ˋ-_f{ }j~IX"f^-\o8~6xRl2lX[HPK#Riq4:ގ$pDq2TyE -r>J0|NQ}Snq"c֪ʶ@N@oɴ^Y^oskn]{}m#!-\ƑV*/wic j IP:o+/ub6-oI`ĶE2wdbU A)sTӅ$̕eVkoMԘx&J:)-_ 5[B).Q:eB)Z2p!kԚiݚoEG`e# R bkwo6Vp]i;NI)"<'%LdNHFֳz\g_FT{/Ƹn /ol_@_?vNn]|o]y6,?L7-ymX~:Ig+'/jz3>^9Gc]gLu1NtC^ $l|\ g %Yg._aZ7fBIO! 
#SLʊK)*@ 6V:0)rj)|nOr%SgA]cY Ǡ9Xy`e}1l(AˬMަfM 1tڢah#9пtYz8]y͸uK oPtCT.[;x43cـVCF%-[ROŀK P*e/3cZmLs+B5ripg ]crE&{7RMSNӣ0F:+H%m`~jcThThA)c&2\  1 Zl:dɆ1qԳKRVLT+!ԗREθ4NҏI$I 3Z^P#Z-U6|U2'z ;T(}ì\Ŝ'g$MPH= s>XDQe {j5yZ?Jvd+LH+rN18wnr"4:y&;R|76\nlN|#9!<^gVB"dPIr gub)Thy`c3kjWx7o]C=xdI*ȦwF+Ԋ!D)(!,Hi@-ZhN=f`lVnr_ kxF[^n[fe߾H0i@BHq$ldJZv=;/!h\pXACIFyNWZYg >etd\q YGܧP6   ZWbeR"=Ht:ZCQ'KmA#9W.ܳ.x5yFEuDG4[h>bX&ĭ ;7/-y$㜖zkgiqDNjV :b+@/ 4gOTjA& =b ؃ L[ϪTJIzb.D'dC (8n KS% %XKHAhMI7+MĹMvr\R*}] c 3|v|@f$N;ZPvGMr iްښH3zHKΌU6-yW(bi_IʅĖоҫgL;Ǜ:qԭ[Q޲ tVm2ݶ<^>2,w)!EEf)9S9x'=0ݹZߟKuudff;%> Α&FYf2m01gϤڂ΅E͛ݷf`NI)姄^DRIkR0eK}YOhXoWӍÒ_vUyr/VE9,/g2SR׷:yK,ySO*`4KB_* Zh& {3%@qVH>xR,Iy G|NhښRYi)KNJKȘI<&3xIs-= m+lgv3sY-Po;c'mӋ:~IĎȉ۞MYtHE>sf\o<.BYYB w _ZK%\NLş6h`08?}k\́sS $K1 NiR$hVX0➂,dIPOADQMDLFd2ҡt:%Ţsi"~Ft \֮e; v}lRS<3 XRB B&z'AaP01U9qqV2YhH$$(Pبp$EJ,Fk犬FkE1FjDUY#N#vqǃcN$+SV9es\Xj*'8G*HP Z׽f9cHXn dJV3DJԨ1Ҁ7\F}9*8FzՁ⤙ >:qɶzT֋Ӌ^\BbdI3NEPJ.7L,qJ:7Zܱ/lwAd+/_N#7*x=Q6v{.z8ёԸ&Knkd,7gYGùf]7.(i E2'|\c43Rya ӄ 1OGAeS^΂ 0S⚽ "z8~ZXmg]׳ʯ\Kڤd j4gB.Ac.6ꛜԺKfz}=._sjRˆZ=xwnyo|gw6Z07>՜7B?s='|qݘjX ֜OtӪ[+2B_yJYďmʮm {:lL9mdul~#V3ϛb\ޭt 62y8Cd[cACH1+}ȖUBV{uۉq CYBsR%Ap4.|K#0AUYvK"3`J@ƁqA:hKrHZd$xe] Br l4k vl[ Q :jc60 Gd*eYtVIY»3Qy.TΦt ŵ-K %!Xe4xP !sr,>FΆ&H')#)Tn!yÂMA9ǙґDo>0')  $Et1h ;26ikJ4sB焚sULGd^@Gn "X6G8f5BYN+@Er2F3NId@vəT.uKo#W\2d01ed_emgP|w YjzٓYggGHR O5GDU \zo\TpMD{`d}:굵dxdj%cmRVz)} `tI-! 
Bgd!Q;΂(I,E@?:&J pth(Qb0mq>˥:^`,jPEԢB(PyW9"@֍r͡7,\B>bu~{Cz^e̷dJ]/KӳJ Wv<LFhyU +K)r4{KsM7#ں~>y>xmbkĜOa􏎧k jm9Ն?_7~X_KV9_9Twtjz+,HG,Xvpp9_՘IrT֏:QWrQ'2RV>cbB+G%?Nf~?.4N-i1 ɟ~oC~xwȅ=|7~G{轜c,dH$(5$ j M͇+7՜l7W6xZbɆXm[#-[;8սis衶Ӣ 'sc `p؟ gZX̩ĬYO߈VOA]ְhypDZiBBZt t&(9T<9e|r\6'< ߞM9 ^4df`.dNu9tb*puhXwUjg[Ol[}qsRnoB?(x5mezx+JJbtL1F66:R"ȩ>ط*%[ٖ|1EuB1(6k2$gg 6l(AKX1{s^<~2Dmc,ը9hb(ad^s9Wt>r\KK?w޾B_LR?[劢eS|˦d2Qr^-bL0R6XHc,NrInCV͸\6$ .s19.E&{7 &iFHg OM@8T3ڐ=B;Ɖ[YP&, hccWKfE4VrV lvq5J'2d ג9@R$=552QȀr_ zT>Hì\Ŝ_R1|(- s>X'1ʒ:SNpf|aҨi@Ds)Ź\*)g$ ։3M*e 2̹SU^U㉯l pdVB"d/JB*p6Zg1xdlՎx(:i!J?reZ,7Wx(H:#4J{FQ%J6XN m)}ىG=Xcژ 8d[C|NЋIX.}IMɢW%dqbYsղ ޛpݛ`gy'.2-E>OG eD\~+z?~8ja4qTkn䢥L3 (4`+gD|ԆzיE胲Eںѳ$ttZv..bY:tIϮ ouL]UG_޴W'I_ioӟs~1N {ˏerbIMo@KGػ]G%[n/!/0G0+ CS`6=4o4gЩa`6T=5fcOTYEyuT<,Ւ'`$ I,ؠIrH,6 %ϠcJ`fRBZKƀ p@h//MJb ژ9c6҉~%TnK_ 9IN6CD,tۍOwMgU wmP\P3IN! ˨$^k:1 V'*;_7M{"z{N:eTN%+vmy&w Ve6Ӽac6DߜS(sf1VB q!J wPf.MrvrzKZed1H,,y&`^\9BiW`\x<{d)5O-١p9VFUFV;pv[<du >dF)2<ѣR)89}p.YtLo>/J i˰={l݃r{Zh'{/O6}/E{~z~8:q2ș \l!*! 0솤s(PC4`rd2$YC-.t!%jcjYkUF!O#}gC]->SbxGyzۻY$m]?:yxx,pY&t@{<7XEzPaEx&EY@:3.bӐ!bR .hD4A$#0± YH1ÐH>)6䩂<˼?_?tjp1r{G)P@2::.F")甑D4f7Cjf5C!C,RT۳ H  + ^ xi1iIܝ}p>.3VDbGfx(+$A8:Y'٪,UGgO@kʔT!dJ2%} xrW]a=  g&`0H2__Фե}?aǗ/C2d|5za)V۝.Gyls;]N5e@G.RmZ=WwtFm~ƻ豄OM.9-?i?wSwa--C0֋!ζ/<䥵sБ@E2hGcPIB3BŶQe+bzb3T:?p.G91L-&KdĨst%6H0Q"deJΑA{K+ ^+3y7MS67~rÏ+“ǪwDQ*ߦ7J^Xw4x #zė8]E/B]k |Tlr=R 5,dV tK%>Xlk-)V4o㧒4'rQw>{ـjk?\Rn&eUv{.7_6J& fd 4>Q3$~!}0ҧs~}oveS"\ zlkOYC_ns%]C^=-.-=6,>~~t0Û/jyup_n&|? ~ӟs! ;]E*[f*ꯄ˓qq\4CSxh M4kܛCSxh  CSxhCSxh M!OSxh M)<4CSxh M)<4CSxh M)<4Ch M)<4CSxh M)<4Z[nYGWݢó w4&gYvw:pשAdǓNIKep-,)ؾye9VۿQȸ:iqa}dD$xspCI~>IGdSHQ-ŲD2+=DK#C,iN": ) ^TV즪8]xw_/({_Frt^¼ZZ-o\Zu^I`Xdez@Aǂ1[`YiLl"gPW^a5ڑd9 TdpU N&&&=TΞRf)c) 0f%Q)i\Ezy%,U'Cl(c0BDc٤_AM uYm9c{:1d]ܦ +2}hcN8)B֑Әi@5ʃnNƈPP}1Ũs8`1hP+*J*MSe~Cw\tnĶ.͠6ہGS'[ih<.(z>Ր%t"O\]H`VN'+h%%{-vבӑcPQ%LI%LV)儴3qL9rXDCnӂՕ8Zg 5ӖrZZrdutԲN9Gx{{P~#__fHt2 xyi Hw<<#HSƜ'`U,ƊThYxewrNIys:k1ĜCB&A2n*|ԒQ!H(=#Y0ڔ %pS !ϕJ\d9^4c0 cO)`F!߄߾0^t#1tzet ˝r&0Ni},E M0ޤy! 
&y[L cR R-NgmϮpv<-M|>ݩV'!@u4%6ls_tT>oqd;prjoH{)\w=ҒLnRDURLT*^pAEbD]h MJZH.@ntl1Z\̒ѩ1 "rIHpCtk@62VQcոJ5,63BU e£X%n 3ސ2ا>Mqß7`8/ؙy$[-ǜq gU0 aR*N~ɬ1g`2!EMi,HeʮP>)o21)UFjٍan=ozo\CcZa2ooyub1 y`wt!]Gk5ߏ v[M-꾪_B~?e=替o38e\z+mŌ<ʠL )KkA%?m6#K~˸zfydz".-Tnާ^x>z(,K]ekY'8K02(Jxf} 8 !1ktt~9W-{e;Ud6= -*;hev QPQ2 b=,XKQ" AË:~>KQXZfT /Ww?I_tw 6^6M.nt_B&aW=~c`ðZpWW5UZ~1!DݝN6~g/N5d"ׅu,huQu:PDcU'FH˜S ҫ$߼􅗯2: Jfg2A9i,Š1\Bb "Fj6 璟K?MJ蟵cuG? ާw WE wjQo>7C9"rE3@{5I], s9MIL{J(JyM៊+d1m^])?_|y oZaqvhϡ.A_MQfN(ר;|]+QLU">b3<] >pͻG㻣o?Q8<5¿sDjw Mok\ß1kho<4hЦY|qqye\tgBB4bՀ6n\n$nvŭ>4+1=}k ~w;Y[uHĈ@a/ l}{R%NФ Ά+og^ Ƀs": d Ĝ  >ztd JԿmRd"UCJA09$DnGLr2N KʔR@@&%!KZ^BmTxb]aX5 [:G' s1`o4VPO*~mrFuVR 9zap*kH)\s̯Jt™2172bS~~9SqY{-dUҊjjs{8q^iIu5%W+έ ֵioՖ{gV`f#@'>p6ލG޷?] (&!RWDtɣ)-"c h HXU$BjcM",v=Hq* I.hyNӮ,8xC*w!ojR `RXxQ1x`rj} 6rA{#̪}w/\y=TKv^v*0QWkη޽vc>;&2wG.GcLhM*OVVJi-3zΌ"#dwF rf/#L(jP6m<##v+Yбr\A2z+Jt}X$qeb$'W'W\٢"-{yRKȻ1 I##uI>J_䇮?:ݰ8K2VxL =w!z1 ^}"*{ cԜۗϗuD%TF" BUK ҕ>NnF6P%B T}\ /!`M 8k-#DcAH6ܴ6Μa8z .G1bRbjnkö6 lKL2^s/mRW`F]eruet.*SLdGOHj L=UW$W'UWV'QWldyu5QW.#DJC ޽\oFa[3Tlt5FM5LE VF]er%u5r-Qbu6+$p5*+նLx'$4+_zjOYh7V^?%ɹr'ӲПI/^v\ۣWL qD,HY&JmuZᝡʩ4كYnv֘Bmdɺbe:-_4 ~}g ˿U?9Yf;L76PP>EdA(+me)1Hptt%@vUVϜ>|dwQ*޻z;h4[W; skjXu4ݮ:U5xj4{س,?Xjݔ J(.>RH[F^cg;~_ZzRQ͚&IaoR؛&IaoR؛NR1an| f&z[˙Z7=l^d|YA#{PGK/ep0N[ >8߅fkmbWeqȢ&S "$9\F'J%U$8> ETԹyաpT[IgӤ>v1Qh-!ZPsfNA$s֥%Ԡj=Jɭ7^7r,i4CQE 1),HzHvTJ3wrG\:s+$)SAMo<[GfIM.FecWTVтB2h0_~#"kYdn͠y~G]_缔b|{qˉjZ;ヨxݗb?2BR&a"AMKEO)@NiJ&peFk:.Nj.!g1^|T}t櫣wK{C_|Mg? {|R }UWx:U=̷r^)}ztոf-.a6_-moV:@TkѾw,0 ml|ZxkWYt& }A+P AC3xA#P,%Nl"D[ 4>Q42N||Nlyrfށ lѡJ||O=P2͒?2IQ*S=>[;xik(o5'/gb&ɒ PlӵI#HV:_#w m#@Qk4n'D6K-xŠǛLk^3W.)S@ ie )lI%-ˆG]RTTQAߧ9İBJDdM?]\z ʬ̾zYǛcc:ߔ;OkȉtjJTNhw4e=-ۣ7E/W-)ߌXӯd|vzJ՝Oxr';q3FYhؔ+y+'5\sr:{w_- 9'iBǹqǬ+ϒG~֨;g=i4KA>^>ϏcZ-F>:iƿhF9ev] x{S3Ӻ(߃j^;v e9.܎bV ' âu3΋w| 7[3횭[!`]8w5qEn? 
{Gno^>>֠`IZh}&/`Q9z3:cD bdf`3ˤ7?kji+{>W6}+yp m;W@ؿ Aamk ڈA"k\M&J)8Ig>$>`0beZ{Fi̪D3 ThdB,q$j 3D6י |AOսeX"Ȍe2Y0WS2"="lvs*ZM$x˫#DϨ${<8kIYXGEy҅S*¢$yH)4}A?Wos(6,k(i"c22"ZoZzy)ϘK7q}73OL?¬n?zԓ@Qo8=>Ai 섗zFˡZꦫw."}VJfB.d+BKP!vE:SD#BPT6" Ef9yܥ䲜׳e毤,ߚqHf;;6uQc;ҭd׫AqjˏG A.0֯O ܀h*Nh/WsWWHG| [G.EGiTh,GWh9 *H_tu@m^wty<mժ5s9ln^WFq˰~%0_ ;?<yZ#qH9i?=ߘ?nQKſȌ=2]San쪜#Y _Ύ_?=(WPDvh.~wiv6{mKo`mQТ3O)75Ghq I~3iÛ1c?&ͻh }_3ЍbpœieIoa8qk\DŽ?x6 Λ.ӚYf$k-c YUgP5^Tz1C)hז'{Llt|<>;-+&ka*cN &c HGS 0tDZi<$F~fŎoTum3| i?D+^>ŀhg\ GR?PZ)n⦱78,:<y>\g5zPm\,"4Ԥ+BƎGR?_j{׵L8WW:z= 8UlNBVq"+ib5j&#HIrIM2a8?9NDɴ,m&YUZ6 =7&sl8ļP۫ۮ7]g0;MҌʧl@&Z+66ȤIM$'c\\AarTl;Jl$#8e8RjEk%b|˲IEU4qMJu9QEN)/Q<nDBIuCs(rRٻ8WyJGD6<;o^yJ($G]e_lf7bX# EuUW̌xUt<9W'rP*9T=Ua8 bq*]e.Z[2=΢Ԝ#e*ƙXiRUKDNiCJ6DOm$r2p+o'+pBľƶ[:5Y-vV/^ߓrkikopޢk;9N3YG|9. y9+E*t^AQd(HvAA$'6m nhjW\ʶk+]d ZQ,3kXF1eܕ06{5m)U9I_JcN!F/TUmfj ^XiFez6{{){H8wEƛ\lHB ʖ\qJgU*2XySB*jLaJxT ß,od*$ViED6oϮ/8,+wse:Շ&'džCzՑâTe8*dfqou{qEZwXB1% R*il{%I"9q3k 5j.7֜kq$rJG%ʡdW[LBFS)TTDfƅIƱ\H=sl6xǙj˝ ϮoqƮ'xb\IW))V&h˙zӕZm8{!+cTZ5&ވq(؛|ˈ3#Ĉ#' c8!,AL3:r2;Kp\ZRczeD|¤J-x&BSI+:*Og?#S0xN5ޤX^=xq{WJD61*礲.X:9%X`IG#,+5N$8ξc(|h'=$Y1sQl8C2b )7,7яTˬxiuv.ૄԋo_8YdIIp_]qx`+r }|k2Uן8,/7ǡ[ b d:$pBGp6?cwr/R"ޏh)[ ,%6ƍe)uzK R \J4BK)GCW c֘UCdCW)GCW F-вCRDW/Kf;"` 64jhOt3^ hb,tPÞꛡ+z+}Ugtu`o3ap3/ ECWtrXtEMtuKҊ~k;s;tW(w⧳vB]ov:OY =gf/of9g씙I:T+c^6;|Puv P",? ˋ+vywNxc?}xWj!є{X^ jzߨ3@;?pW2e:!v!euθ< @hW s9 QebDt5jp ]5NW@ ?gHWsӪ ;#GCW c+btP*3 E 1YWL_aU/\<jh J'ztŖ#+Fx@<ZoNW@) D2! 0v4UC (Yډ^ ]Yg#+v4 ;=UCOW@9-Lrx]YjphZ3MK+ɸ1 \U;ghs= VDWc+~+@j?Fg^j? 
Sڻ0f`@W<ձC/I;^fҟ*w^$Bkxd{K0חe}aUu}*9D5}ּ>yLȫ԰|^_\^ߜ3MIo;<5UzTcS09^z&(6їU֊]-8ˬ4ѓ"p5.HdЄKUpp4W yy}vռ7@~@]λ\~_å`T 2 e5Sq5Z&+UN& 5IJ5\Gh!tE죕B"dP؆JYqcT&cnWgKVd)ljTI9-Udhg>[ a6T"J%S mQ;ђwEc>RB)P儋NR2FIǜ0^Ќmfj ^بNZt wn/`R풥ZSJH),eUdaIqH E=t!!>0X2 r2}cR$%J8_S Ai=fQw4Yf >*!dҢjB_&!M*C: JX1T`H@ƹ Z:ZWp~%\M<[o b{z]F OG0[nC*7Y8B*FKk2҂}G|2EoD6cys'ٴ*I+JdT)j%xNL%)ȌU{'1 e J,?¾^ɇKMpɽ2M)Ǵ_ak:`@jaZג'XabaJ V"k/ EvQk.&Jeb0v>I#ub)iG"dBU"lJ;#kf憠Fk E„gUa Q= \0(+m)$eUX46zXd<)K6bn0K v V/dm>ZSC*l;nUB^@dYeMn]9 ڢQT cmVڛw".L^٠Y\Z6&-}5lIՌ9&G..0!q`[LJʬlsQ С +Z%H@rɱw%Â`KFt 5vsA>V S嬅.WI % J@F;0$ \8I d>rNƒ["dUGX +uW|,n8 Lvt_^a %t2fjwb2.!;[v@8OedŠhA4!Ds5FeAx0Aw΂GL n  d $*dSm'uٻ7$+^W1zX@/^E"{lځ}OlJb̰ M2#NSUO49S8"C %XVGKе 2"(QwPҧsC^AJ./s1&#Tuo{5t2MG"<'6CܖSBGk .IҎa:9@1^!B {!=f br߲=V>/ hDhQ:fƮa6 sRHQ-fWbd yP*Zho:"㬪ZU#`ypRBHcF8&gو^۴%VmZ4>V IGH$5$YI$eoڀJM.U{9?ҒyҘ$a:J` ئ}AߦT%L0AlmRih\> csm-f],Hnsv5[.ڎsu&NA[!!loNN44z63 >{$;-,fgm5EZSR%9O%O ]z31iOe:OٰwRbpл!)C^zHmk* FD>M'MJTEJvP-hH &Ш3) @)C|EwڪGͰ%6vł*'+ EISL1^MA5w3`QO|AWLF/LtB)j zf,(jFb1wH {yz/)-0'GJFdUY[tڠ@aǓ&X©vltEkaF3^ 2).$6O34%0]2'ʓ +ZOQ!,iJ C\1Fnv5'E!aކ`Ic6ՔWlKC$H0r(:fAjn]T!t$a. ZP5uhD;S`]pt V/|/r IQ5b/Rd/C-?)y :T0N֒Y>';%!&[I`7Yr{ 2m.i>ݿju=Zݬz~{La7ɋj#ZWgϤW4B4?c.oI#ýi7}<߭6[77/n@'o|sܜb=aw[ wRީSNw \Eo^1'8 ް@ ?h*v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b';N & 9:3' y@,:9'.J# tN @b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; Nu!\H駂 8_{pO;. d@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v(F5&';'Z;R{v]Fkv@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b';. 
zm?@;]8sqpqhÙPơ+w]9N4Fɇ[ݦ mϓx|CN'--ng8[;z*?O'79P녛;5E ǟ#o׳D&'y&λ@EBlY7?I&R9Bj붾UW({ѽ꟯Hڿ}pQtuv}Gn.y _{bSbo\OSR͵>) %s0"ah ]7\ƢC rOݰnݠRfLtExRC+Bi5ҕ!]v4tEphК8t"{.ښa4tEphA5^erR FDWxD J9"F㵫K+/j<"` ;"~ .-ˡ VjDtExAXn5+B<UTUDW ]\oBW@ J4t?sPEyv: w2xoMtuZ}L]#3]=v {mFCWת~tE(cD2D?H{ ~GvM05&~EOt>d/9⼝>~e{ɽm/j=+V_浴[\/He'g?/5E' sAL߿m 6ccKrĉsA_b z|r Cz+K]ޚmE OL#7C4}g]9b&~FeͲ3,T~fnc/:uL=4(c㫏6LGZ#&##p'w&8"ƳNr4tEh:]J OHWF GDWGCWׅqteLWHWH?zξ%8f4/.qҏI]2Y"羅qh:]9z]]"]yzxzzx֮nͦ)B%EUA158G]>CePtuttJp\[\;޶*U̩Wh<3x`+g^: g2xJ=0_8c^*"D +o!7=9Q -3noL[c2܆8gqpnZ5|@(Y7\nP^ GDW,xkX~[}9]ʡ(b&txqSY!%Eҕ> ?""#j n4c+^:]ʡ]g&tevL%L ]\BW@+uOP*K+pqDtҎnrtW>1t"F1]] ]fLtbg_$ ^"]S3ʩb,tZ4*FԘ^$粗q3硫SD6Q?Z{mGRWǡܻ *AWS/{g/A5r* :8d?h/9⼝-3M'{t2GRK1j󄋣s;+sDv!lu* b;a6z{]Il?x8}uBVyyvn+)N7\uyu/˗M+1;4RZdx,KN} !%-DϽ{Jop߾IYFz|v>zz}]L^_?ܶ{Nyaw se={|b165,UP S˺.`9PO{;Lugٖ}FVT]‹zLFSdh^; yR7Ҫuy,s%:$WIY7.mԚTJTm<b19ֶڔQT>=3 k^ooF@ܽr q?T;7zXb=pK$b~߾nC߽`ijw?_B2+ F~ۏvgHܿ uxrG} 8<;wX4UE"ho^ɬ[HH+VUC-kNEZJ-&!ȂLL#˺$!d}!KkBPl58ANpGjeHR\/R*cub(MzW)'J_T,_#YnZYi7oUY,\aal٫j32* zJ>h{v߹; *Gʿ^METZDJ\g > mUluIih;-Z"v, ]׳Eܷ ;Eim{{D'A?b?{}=owgsFܱ]\B<}wY?{n7iy_3y/|6Yr{strhS|Nju9/j[gWBfѱ}(N^ qW %#tZ6\ޮbEZWGC$9"ȇ$߫ȇrr˧"~x)%sHjI5kXr=5=]S]U=-_~bф_jOvR~-]M '-'rצln`'_Wy:1'AP lXTq+!UZ+=ĢJn x!RxuH}~T)5)%:tÈ),i)):s)8QwSऴEiMc A ,.# &Ԗ*"/QNkcTc[J>d H'Bݍ:%4Ԛ*0U67חia|*['8m/4-/_*/vwTRv"5R<3oIOXx_>?̮D^ǿMUtȣyq5ZR1S4E7ӎ[Ţ_Kh#5AxEp>}[U[C/b @P.F,BΤ$uVtMAS@Ab $]E#Bxx]O0BJR1f5@C(8tQ6].D54J/|rNIFkF x-9E% iA5j&a6q^ho?ݫ~eg]1og=V3ΫKid{zs.J:My(-Mtf@ohLfЌDyGeX?MG!nrM;b5PlOmۑ߼Anozft3&R/ 19r$i)e-W8B b2ϑ8y:ɓR"QEhUbi,HXS]1"YO*,d>5YǟDyWC=t[kG a|$LQ` ICҤM Вfy8'fk|Nϖn6hy n]޿Ǻ.ԝߤaK#_){H3=0bz1#^}@!{fH?WFZ-o$}+ YZkWd)VұV"N8ɜG|r=^OgkMgQw&_YWs)b6Z:᰼ƃ*N-hLzND`f|A]=&iL*E M.eI CyRr_.֙̌R~QdYݧ=><Y/"VbhCM']=t` 7 s݀1Ťh 1 b3v1-rR3n@H-JjQKMѹ(!e**9EHo%i UF@97 sim@cdWv-ķ?l͎+k)>//WmB_rCϿ8dє(UR D j$xEc nR x7)L셰ޓjdo"!K&᜔&ZET%BB=9;qe*`sw׸gpSs+1K }sfy};6F;ИN{F>~.yg}gӷRg)|t._ :Yc`Ϩ0ba6$357d(h"*>:{,h*:$wn R #^xhZiq u?+] 9f0Hnnt(.CS_k&;yI+!%j_g+5^ m[nqKH O9MV1^%>M@NRuqWϓ?Y]݇|-QOEGCi{cë{Jٟ'zW#<1\{C 1ށn{@ 
F+\Vmi1p#+E +wLyĤqk'5o7Rî1 ] !(~KHXLwý Hݬ}Dq3#DŽ3`Ӝq}},VGxFɞSxmJn_)f]>ϦtǕvt޹lqχmI7`oQhEi="ws[wU^tmqʻ/V}h_=7 .{ul*D ?9Տ}f*'bsLr޸V:"iI냋PѥeJVJٵN?[}qfCNjj |Φߞ_!mQQGSH ЬIp2 UΈMm%$Fˇnoh:͍o8񽝐Mҧ8PNm VLGD&PO_)oڃxD^ |ޭ >5XLDCW,(2:Tz8򌡾>Ms3~zUШ^뿳jsR3'`;_ZA5PX)>CNm(>1̃N9l,9ڋ<D6jkdBC9IѺO+qCxykCws}5_1Mimy$OSjyϡ =[ޢ?nw]GF0l,DL*Hl1.6 B U 2_rRBRڱPZQL)KCÐ%yH4$o!Ȩ))b*jVB'~*P&d`̳〦0oPۘ!)=@: ,p9ȀyX#5P+D˜$Xfo*yIy^[|QIHfHQ|@$b2:9km4adJhF'4kcgcc\u];p3ht;P v]eta#dfR'Nk0]H]h4)!Ǻ'v-v}O~L\R :lцbPiYTvZk0(S9@AeFB"v+sDG-Ӟƺ[ xbVÞ[J[/J-܃=(/}pq+#|Kز֛7s- UZ~-KlZ aᴵW|btէGƇO05 q<m!9h?{HJ_vW;y!C/ӻ3Ҍ=}8Byf0 +~"w`2j P/"hz ;#nGyÓFY3L$GE*yRr0iDmIV[2 8rc!hjߪmR$)MMw[ %QMnѨ1Y}[5w&Y_/qk' #¹Ds樓Q3Eb>d gLjF'^{ Q#8Q${Aw!Z橏p(t"8}du;DUnxj[kSN &'۞@5@`[7_R+v G_Q&o+`r@GxxzRu:b4g_oKZYP \@U%&JAZaM5s:/ZbhE2B 4~QN-GT12lt~CaA?}8Ϗp3?N>¿{U(¿_WxNޠiQW޴iaM r^gW _* ѨClcw֏aFpK9$*N{'hҊxMsrK!!8UdAu[@5NP Sr)&9f O8Rb&{ٔe/I.XİL)pD"GhRԲӨVfej:`}|5杞ZVНh#vP0~RF~ʿ*p SKy^CLHr2)pLLs̩%߾RԲAIU9`ؤ2*{ݠZ&pRJVgC4u(k6JP:V()ȝ "锨ăCܠ%SXmmeP3fqW/׬û~XŢ?̫@(/⠸ykB rT˟A?p_`x?XdZc{B놏I"yo{KЪn[]Q^>PeQ_3_/sb{3DzAC e&öXjSꉼvꅞ=hVtDB8`X!ɪu9w%.46cl NԻeA}:^jzBm}~qZBT_biZ{Ҵ$ ̀Ah+t[ f&(5tEFR^a8spIeyݷX*0AI ߯$ P_HKBHFs(f'ߢiZ Kɷ"㎃ƩJh *(IȤwj"$aP\{` Dx>mAL&`BVHu mm"H!u/q1!QYU|e/85-r@^'?ӖzQ !jL^ze Lzڀka*'NkʃFZ'1U)%'!!XCCq&p"!.aGb% zV$ ˓xuM0'Z]&X)Q>^9nza[nR?iQA'+|l6K^[n\m{r!Ow3]'.㪎@5I K%-!YWBeJkR}sK4Tzsy*|qߎάZgvAIvA;yaQ؛߅~RrvTe3a C6$57:Eb9Vp[㧙p ڽ1Qkg}NL0vE d&Z ÕЖhV7>Z \E Lb "B5t*BUF9ՁѕfB6iWhO,fH[*պt(%] ]n]!`pj ]e`FXv^=#bt!`!KW•rt!ZMBW4Qtd}6r!9k]!`%CWm+Dwx2J`]"]r/lg.F8"}.'\Ni ;.,1(8_9ގo 8}|9В҂"O"2Oooa?n{q J܋72v(TɸP@O : q{A$jncgĥPJU2=54 . 
dz2yU .~V(\ X !ezCF Po̴2\PmVUFidGWHW"ʀn ]eZ]*dvut%@O>BU*Sv QΉ|t%Q[DWXSpyk3Zx*!ҕ\3"BUhe]e]$]i8mv3ж55tn d]"](mSdpi ]e;o蚯)f;7y}6C { lݜhm]m)(pIڢĩN )([A~x8gx53+ƿG`];KC`$DEV$ueygFSEU$i!jn=ODۻ夰tlRHKU:;i8 rQ1З`>V!_WؕlU(- RR[NAR0#pr*w79^wݗxPxZǯN N+`AeJĔN8S&&U⫴z#]BkDv_'uiOtدcم@35P\_hh'Um&&(AQCRNBikͬ^3-r"do.'} r"d5݉Q*9ЉR0pn ]!ZAx*d+41mڢAghUF)-C+@m+,i ]eT2ZnNWBFJt*V13\UNPttut%M *uUFeGWHWP@wm 6Іh m:]!JX(`Е6 h X$13ZxUFٹɛA- E+kHkA*|W]5_=;w7,aI7+ ]mRцlb]m,&hYSF BO=v ft(%[G'++edzЕe:])ʭmXxBWpVa*R2T>yq/b7;:Fi]kLz}G9fAFNdMsL^#-8Dg=8y"'c5[wr#'&+'W³ЕMntbNtutu'+KnRNW2]!]yNsѕn&US_': y3])` *g+ n%Ӊ{xp ,܉?z׷ӕ5HWf3+Е=ԉ1U E&+LCW 7Y h_};])JNt+~+[2]8C.CֹZ1z>y?VEW]^:֛m Ghk;BVE4{;ɡ\ƽk{ؙt8nP>̢mE)P78.Dt46Gmd{#+J5nlLCW MCWJQ]#]n"aRg+E (Ŝ1ҕ(T3]).g+ NW@zq㡫LJynr͠|ω$J3+;Rn;JQ1UL{ϴ])4]n2ӘAE6OWrkUJ$2B4tpf+E6?w(%CWC_$&~̽ۃ/d=wm<ʭm+bG=)6Zvr=ڭgYhٯ]nlh=O|4h0qf{tpe3PO]!]w} [ Zӕ':F룳ѕN< ]nfR:e?h*dpG]3nRNNWOSGIWb%NEWbgJ8 ]-o *JoNtu tu1~º#Jk~8奶Ltnd4m&Co/<{{}Pww?b_/;_i?>EJ#kw{E__tA?unwK6LJ?6N]?E5B B`W5rlowv?zq§g|r$_<惿40K釷R>|ywpG[dqݵ,P4R?'/TՇxߎ~PD;ݲ ˢ2k q }̑cY0M?o9l}ofzvG8 ZbCIȔn [!fU||{wL}A{uoo9>x?ȶ-qp'$Rxb3lOh^I6v2-B+P0;/j#76CL _cl\lޚ/wB  5>LC@1wɦ&`F]X)somΌqh9Ek\&Z=u{(Qi,shIl%'ɗ ŅEH)ݻ!bi k+ o5 z*PR '-nH 9"{n<8  9| }C1Zc˕1Zr6A} <̉~ ko MMД> R=6}d;B40.;[Pa 29M~Khdƈ2jY|5~gb1$}|'z݀FnB~o?\d%j]SAd*هX :D>,O>7!Xͪj28kC9QJj mhF lzMqM>)d-b'܉s3reߡgs|QM^B`-kB`5H3X/*кIG*M Z2"lj]T+E6Mr} `G$-Uk9H(2!EEv\;JCu4-,Q]Zu;a:[g^F*U%sH \,KAm1x[ h**u(:ۡ-"@k!O#4]8w ɪaD9sC!V}(iW"Hm!ty![C8Fn C)Բ*.t-#giuv?K+5rL=V+B@pl/(Jl@5lmb_PvPc^@n7((ZE 8j])(k'F_od#\"2+wop2v)&t!yiM +QK}*PLl F򊳻bRdW"o;ƨ͐jPo]P?<;1l a[>a ` B(!2}il}uڂ:c 9  & cb *`*wZuW9X4hu9r HQ]V r 6P( _ kPA+l*xc(Q@HseԸEp5U{ 2|Yu 8;w JH]Qv Jj B֭6FA#d^$*3$$ dQl] EnMU p1] ʰK.vPZMw!M^21OOA{X3fAc̠ʩ*N/ 0by:/ˏ7\5 ?~`PhYO|`H6#&2 bPT8xiOPtdY)qHC16u 2j i<1YW ]!0Ao%CJP$JDMk* Acq;F+]Z˂1Ztü &(W\d6E[(w2hGxk`dU% 9աQ5;o:; l;dzeA4D2X O?%jU)V XF8;K+$Kyρ 7D2RS}I*q`8#(4D42ݯEx8gW@$92vQ>() XV_U31beh h/DጰJ":)J@c'Yj:LlM5:QZBM!jPk b= &qWy a},-⬰%flg'jjCI?VgHWYtgWM529 j3YTʭ^o5{%EEQVJ@h4:[73zcȷ޼69eq=zsq~/wio. 
LM ԽQlFPI6 `_ ;Z<,Fݚ{9&F0٢d,h Ƭ_ˌ] @vUFv٨s8nHJxKT€ r* ~Bg؍ -|-=LЪI*WsHwTP=`ǰ2]MUڃFP#@zp}V=5^8TVh&8`uSj2|rwp=,9F~G$o(oV1 /9Cڢ tQjFR!ᗫp`\3㨩1dzI⻃6& D[ /+5VzP[FѺLb2Bjb  tM%]@u~98p19>@z7ARw!*C=6(= @PHZ5Z3Ea2SyP/FWfiq0]pdWF ֓"8 NR #t :h Näeca՘-J2x7]BD!c-ձԤnYqIP^.X,Z(Z@BH]]~Fֻ+J 1C_I0H5'Kt!ߏkqR0NNB<spn&y0G/윣0LܶZ񆺛>_y76oqwnUA^/ҮRWtWW=kҫO`@?/ַ޵6rcٿ" ~fw3Y fL [۲d=A^/.F;*xyx/C2}JZpJ) (vvWi6ۻ$Kvtxʶ:-Wn<2e,@g\/nҟ_5N gKWU.hMǨا7B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J TUY }NMg`et5J U5gZp|*x&q*OQ J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@|@KmjRq4*R\ëQqPrJcTL T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%+B֤* G pF h+8P<(@MJ T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%(.WB=^_'?MRRvy}^\ZP;u0,^ LZpI_FȝU#\̼̪fq.}6%cիSt`mj/CG R{ЕBznSMs`]M;3R2WB)7eNlv8zme ]W" ~lX&2VȆ҆C=-Q_nE$*/q[PҚ`&O?mbZ" 4$,#To($&<韭spGaD=ii=7wJ_[V rB7$^4_G6Uji;yNg Dl5";߸cce` U" 2} J#o8BOKV*p ZKNWRtut UDW1UkM-tŋZJʼn `Y rE-th rA UY,k eQCW.UA}骠JYɉ `WZ誠t(3HWGHWE'5v55l6#`Ԁ,)\.ӿ3_5X)2?s2O?/F7s2|4ǦZ]^Q2SEAalr'Mս`Z&RX 'NV,ŗIm 0I_6 P֗Z+:}iGtd*Bvy6'ڧOۅǣyhQPxkuT\ެ*gJ)S a)<]7Kji?O[vkuVME ˆۚ/@?f?}LMCaXHQV*Yh%}w JYB֢uf5}骠d곡+cSBWo u凥Zze\웸NAWUO=j*h;]V!]#]IA{&gJ2l.5Aq.7W׭7qZZ_jAHۼۓLm½nȽLQRx`Cjdhl-b_V.^.-XhYأ0)/Fq%s&Qmɬ$ɼX]-rSeKm3i8Ϊƺr10!KjRk>dS6jI^k'Z&ÜLrQBLMOzZH\A)FrGqʴi "U3N^ w*(D:BTR++ \Ejֈӕ([#+ +fk U ]`}%/+GVDW_PPrtutUN5";W `AyE:2L1ZS0Xz \#j+@i骠D(2{ hU*hM2JIл|Xp{ Sj?cW=W{gte+tܪޚZ誠tUP2tut%%?N[d()JZk)PGE'm 4}{Q~o(pE5~CA+uR)oUS'늮 \Fj Jtut%Dq+Xj5UA)p "k D3\Z] ]ZT骠D(J K0hՌ] JÑVR*m=tp Z)J#+#ū :+, \^P{A+i骠T]#]Y?h/+l+V~쪠ZtU;]ޞE>({6t\iKW5Pھte[TJ1ZhJC zʹȕkz=ʭ9&{s^3L^bw\kF<du,Vrg٫f2X5OìdErTqBx5\{%=ԒP 1FrZ-8%X;:×j˄k!Ioo߲ʛ."[g7o@M-((?80qF7`L\KZ4.sMNv8CnlcDۦ*ֻyVʧzm=[>2tѿR|(IC7oh282rz/m-PiCϦ>^\_on^7id\eˏgiso> |W!~H,Жw~hJ_79٨ΨxbgitxQʖKDh2>lp΁6]ѥA橉l.+qJr?jSP23C7B>罥B$C:S#u;˝^im0[=0b^%NܖyaEћf*. 
VV ]e{:o^>z9'ssa.QKWy[yu ߻ŻM>gh ni/mhf#\l7{2{bOZ?wpӥߟzHMGO_z b _O3Gͧ=〮pɇ.7SCvCHݖn>dD4gufgkBR6+k=>o?*X!答l:}$݇X3y6[s6XzH|t}-RٖLs]u3 <4ݼ&  K.,CFyq>% Kৗ(iA.ܮlg'TrUbp-d1h |(ϸ g9HLďTw/Jn- *6oWg8͚Ug\oM/ݳ$aiDtel=&{7ppz-)uȖ ]dKzْ"ã: #$I5J'm$+(긵7qhC{!yv F'( Dsͼ Ezai@1%c>'F(ɢ.EF;;tƗA`3& $ oZ$%P y<|_\nmecmE%$<ל1kG2S\eYo3*HGBvVKà2eoTs.IQ[%QtV:][JZp!fě襵HPX48jבlq:h)l˷,.4k }C3k$Q!'Aj 7YaLK8bٹlY.HO6~|vj )M? B ]/#4ZOʝVJy%p&@ hzf)LҬm#A8x@F%u 4^ o<Tc1D6>x%02iNg9bc/.YUd$)EI4\̓BƵw!d!2H #vJ9n %6h0-xk"& RJO5@gp>&Ov>=2z})gmWz`|uJtzBx;nBAyG\-,dvq+qf B݁W$54X8EDb, ٠WtMfmfm/7[&2K9lO4;N a$;PF3!Z繣kK]o~<υޯDPBl0"TP'$45#1ed4.Y9 v -#Vydx~c$<4 ]*/%: 5=i"Vw_ל[>Xc -dA/iU/Xt1_ 鞱s%C!*)S"*'-Pü;cŭ3Ap.d:eFet1{O`_3D &cEeMB*p.Iu͌aXgJg\ؙe< E\ȑ _ oI3>jRiWT@ a9$4pML#LP8Gcvc> |elS!ƾ0bg\F3DFDF|%E#gFZ#!][S[I+}݇Y^ODLL<8u5B7K7$pAHU眼|(T(!\LmM- 6ıq=e\0ڙf)`1r$-ɠ7@5ͦsE:arͤdShE.ޙ^(ᝊ2aVR%ɑB a“H`R.n.Mt=4-\X/bUqUW?8Vk~gWGF~ҠQ]!e|Gd4je){Z[87kyQf^pq'WGޮJ΃o;٧7xz[|ez_ &!U(RAC0s-fI(EE׷~ï] zߐaFxJQ]Sr(_Mꇟ];oz?pxqr©ׁRְE8!lc+>Pj"9dViv`zz] _'iL<9wɠK9_Gexhzo]ۥ ˋBٻ#?ZaX3#Fi0++bʣӉ;Xb䧣_y|r|qőX;w:uV :`HX<֏*pu08.Z&08.,c~(x "GvW)I ny#VC Z^?`\ĖfܿC'ibB,c̀b+yr+LwHhQq*7'ł1((ֈhR<;2Q+M)4Y͎N0NSo{ܸtcs=LH9g,Q;p%$/I8YRJ t*8:@1;ѯ4Y˳}6L&66hݸ:Z\_*+ wSihIZ#6a՗f{:;z序bUTI Q *HUw`rOyCnۍR^6Ө/罤gMF9#њmQDd .Qʬӥɒ:<~:(2m8cQwuvוr&V>z4:ʣW]'`0X}7tsytz5Zio6%q;Epwڳ'g;۠jbc}~?kYw+_`ƅH/!ĩ hvۖN |IQ޽*x*Hwx^Z= W {mv͑Qvɿ.w&gG3DtK v:(K<}'WڀiIh׻7KVYb@R z-12RPE:]>RVYvivk>;N V]]ܼ`q-=\&ZCetY;DFgg$1F)e˒?h:ZVjGZ#ZDdE` SNɦJ ؤ"|чAY x^K(),GXM~%^YE 2oj,:TtO{ݼ#E˲2w91$^}XǮ;]kN|7ݤppl.>>mRDLFt9zQuv*aY@n'D{NTܾݱ-ͥ}=xɚsrk`E;!P!zhï`TKF"12*yy՞|٢6)OCvyЬذv4Q*tRt?i1x]Z\dT1{WtexuGx4i4ԺlXԇ_W_@լ'ҥB9vub1bNP-BՅ>CUC! 
!U \R*u{ݺ(1xFZ5q,4%e2(׶ǸR |N( [WtT"BgA:l:CU݊~B͊渫tύ3c8sg>C1kp >z#IYz ;ЈZICG.JAwP [b=ZOtB!D׷Dbm _P#ۤ ar`^apcC:šlÒebE,dRH4H̤c(TQ|X|TAT-f"u腀J2yV-MhR(V7VfyԻ{J)&$)tIDk`ɰ=ƘT;"(e1egg߁7Vco(9fŚUdKT*CP1i`!|RvxXdȱ^ eu0Z#='vl^PLNy1m2tx)m2/}oYg)od3N/UIR]PDQh $~f kh.obH jbS)%XS *֋~!8Z$e1 kmhC9&&]1mp426djvPt ݧ 6ް9.ymMR Ib8L\嚝)>*+볁\(@kJPJ"GIL6fק/9p*`!'m6m1F$b5Y*i`QCh$J9%6Ba:Fɾ-6gcwXϚMz2iw*ޗ@0K)aqh8Ɠ {/ؽdԚb ĸqW/Egvzq>VjFBCrY+*#QN2@AYZ͠VqR'xl!@#*P YrZ%ku2B-ѲKo8jLL&^Ic`?M+S !(U\Xlb1A Ue1j,{mkՖjxol|XevP"m Ggڲ=y#$B%U- Ψz,XOϾv4MC6X6J+ħtS)HzyAB_8 b{睛.q|_yGÑ:jPؑ >Nt\ϓqA,*f6IE(Od %1WɰbRdv:!A)chK|]zd 7>ql g%yݘ%9S #ݘыc?Ќ.+W/ZE!t|! (}AH?eRs H<R4cֹ䋂!yF:zUQ|CZ.UT]#Hٰ4s؜*!tEfq:guZUv 6}{Oe)1;Ľ]vm!g}0RjLN6k\iv4(E!ՉG0 zQjZ4ëki1gQԡ#}璍](MOF#aWTjpQDp9FYFfL'{1@v`;83r0xuH>o-jY[dw^tSd5XElo$Nυ@x"|<_c\Wϭ?:.xr]ݩ5}l5K`e$)`;-`6&"D p Ēs QȓW 1:Zƴ(r*OH6%S$f kAYü{a ^kg==Lf+א hCuO|"d7'QZd8JƜ Gi֪c(RJ!Q*Qk{|K8XoaۋEM<{ubjIP׊gYfIö :nq;<.6avJ}6MUrX=iSo5zTBJNI4XBЌ uvwmfQzq1r76o]~5-Mڢr}0g_oېA+YV02 2Q;2wQ:r%;e. / uh1J9e M I=$+W30y !BK7 zȿo64[| Yn}m vic;$o북sdCDH@iNڐ ZfǼzŃ,eFwdx K#TqVmK}qvH~M8㷝OW4=dr,jd${m 9.t?oGyxBпLG:$jӥמyt16މ,,J>EjK=9}}q "l^xK<mu鑤$xHJKUSZ0oB^g^ y)cReczVY|~C=rYJK$ɬ3 + rvBpUF< p2pUppU8K+vi=5gMrluIW2-FW%0}zO}}vƧ ):Z&bdYØ:)-,m<#kl??Cŝ氣ul#j W% qgj",BAd\TnukϼP U\;.^Y(^e M'q+3l;0ú vos! a{W?ЌGcI>|sLυF꾇|pazO8 #:O=|RK)gW6żsuR@QT\I]>WxXu^9;|.qƳ>àO<Ň:m(9"ȘZ$\4ygmЀA/}Br̄ɡGٕ-.0=noqiO1]fN yW\\g#,ĆuUuoWi*~>6?'":ՕyǪxj|_,JLk?ɔ닞HnFn&|:@, !VK#$};qbRlMϳ´d'7+ې`L!bVREm@_hOcj1F7(G,e,K>C={*Y61IؤרZEFe{ F i|p |.7Z$kƏ+7jbvН>Pӧy}bmDyum/b$)`;-`6&"D p DJ 8 <{`oJ#eL[K*OH6%S$f kAYY{a ^kg==LfkOkçh| ~Y>g {+I@Gnf#_ɎΏGlO uC6e[ BX= FV3G%%a4XBЌ u37on!OzèoQY|y_[0N>#.V*a:e2$0Q;2* .JGympD(s0 C)P ) dd$p#Gb\%Y/ UC^Ќ,{~YzJG!ߛ7$GRHt҆ `2k>E+|d)/<8Y2\ LfaN!l P̓v< Yurp9BP:k9 b>eȀ):2V"Go9N?:)H8h x Qj#^Рi_#}l}K}xQ*z:80nXFrL^;c2^f ($S 'g;[dia|ŽN['/8䕧2MHUJ%`rJWD\Rw-bRlnhwbU*-Yx_ BdzA/+ުf^_mцQmmci#My~_5=*^؎FK~afzZn o:t*9{喦 i/^w]Ť1M]M꺉*k_wT[Tm+*A{睢sa #Z#^LEUL\* 4V161/< FZmq +ড2%CLw~`ZGt9`wDcTs:9ܗ.&KFEEQF:6tP[}C@eLNyKt>MpTQ c6Twe/B1컏bL7m. 
ie[]z$)% 3LԂy:K!huw-ð˶@<8fy$KB5>6AV@[ګǶU {|ݝ=OxRTxu[hkrJ9fE,9 `* "yL6LkS\Ϥjkjl5V TӅ8c[]uuΫ,[_d";V X/<83ߝpzk\$m}4LŨ,3ix1Z$Ջ&UfqOr9ٻ̧E0F3 K0l:;BfՑGU>v ١YP5nH޲[Rx;xs"L0lH+yv7m]L'ϧ8_ V+yYoX/Ύ&>a0Y,g YC Pime~mChג{QqsB )̦%y<W,¢SBȩJ]mAw=o }Rl;cUL=B8vB Nz&tΩht[j rIaۃq@p)ab>8_*Q[/5~{aoxEBlxz^+"$Zk|1xg{GCs*np4Sn+vMڬټѯg^]{W'63x|GwY`t4r 2 ){P,H0d)rtF |^-۟VtY(Q2G `7l)48=E-Uz} s屫;#a/yn2 41*ɶś )E8YW(兢MD3SjN@O*L``\1ir/HuiPZ8?;zey*J|Fe ' 6Q\1uG?4u p>ÓԹoB+#</^qIgrR陏/a:즫a~dn OSo#O2xm58J&д}pK@qL#2Oǃ9)MT9`iv;c;=WMb$c׈9|2<:V !VpQձ-&G/#qWNW #V#a0י bѸQez%<|OjrTuF]uZ82"x˭/x>w[oj$ø8m|;7pa₾%d+ RTxi~ɀh<9Z 8MR%Q5q߻GÃ8<;&!/{sݫo_W~ra_׫7O^1ru&{ =13VCK j>l7Wʚbܿ~2.מ7VXJ%DR;me4YviAr;)f-/34(#6ASxrCo{p\4sh=/qrB"/Qs%s9$/99TIԁgcN})_KJc5*q٭RJe}zEeXՇyo.2MŐOǓ!1{R.PL?VS;ϻJdM'8K6+Y7V5G>2B)ڔ]"sMP5YR@}+-o:l,}vJ&kRwevY(gp}L^I%(Kd:g d1tdʜVշ'mx+{Kf~8I-2 888j $4ɭk?^}n>z7_VBV7iq)77~fe׬TXmt;F4ևgrY:^Y5c5fv+|oR^LQ(* q5r u0gnykE:a<\q5[+*K7R'-_uaʱh]YZ{u3]vULuh(D?7VL؈?Nk]e|هƧd0j[5xLcLխEi$?yy-YU:f[ظ3^ZQ;}rVwe+wl$hMq13 oSv"9YxJ|0 *%RpTo$OvoLmbr H&fF&LR^HI>)iV'O.W0[Ps_z4Ȕk@xxCiC3w7[vO`vd)lEK 6!1*%ex#%^@ZFkDi_vD"(W7/BJtJCe:C* yZ[ Cц\7_4t+[laڔ6b8s'3[wX&!#ALQ M:PUrA+/s#D (vI{/iE[5EbH+ Hv*Vd-"8}d [=vF@o{:{iD dStF3$)dAƨ)@h$J0"gdFBb6 1J$l9u(Î3pHg4kp ]WA&PC~)ffL{%̬+d ;{$sF}>y;UF+5h#!\JP!9_TF8d{^LՊao'vҞl)`Q9RΚ*)Ϙ깵5n`_z ة^e7ҫl EL+S y e*IR.ZlQL"&HUUYA`/˞ں0le-g= E6%>q+w "IⅭ3FD%&o]PRd*"^h `/:$ e/BaȦ!;*X K]Vķ4-ogHE@Bl4&:,y<>dIㇼ.Hunؐ >Ft\_x`V*J,1"f6h g}v=dF!E UdXB PbRTdv:!A)chK|u{0|r]J0}nx3a<ޮTf}%tN' *;7 ^y*G⣶޶yqoڜWpIK RBiLt(!R[!/oB"m({lJhTmml6X6xlM<ͦٶfֹ䋂!y6!!IT(]0)fO5׆nvogw*Y }vY`ΗLַ;zVCZPs.l6D_Lr*3%kk"{FgNꔭꉷ7M{&ik+ם0+ZJDtFw_e'ӳ/7;0e|)"&#UCRƪ`fl〔Y@o<Ǻf2uE :, 5lH+4'<~ajdu(hx*6;] HTɻȓJ! 
=S䤯}m(v7PzzrЉs%yqGi?VPӸ|./AJԪsYE{j*Q_dy{<~j4LJҊ:>{B|MzUڽaWG[+րuvUgWϑ])&K*K~7X5fܴF6jV0I^)|%p_tvkd9Ͽ< kUZ]~{y;ba;QѢ`G)?_ я(0Tv7`ұ?/ۚǣ?6US(R3=;K=yse2Z\1s(g5j:&OO@Q3otI^d"lEjQDkWIjMY +AАwq>T$2["-GEΓ0/͈O=/"*kG}C=v*M* Eᚠc>d$vsRGulk1ʒ}*nU(-ghɡdQN{Į`foU쾰 ]gWJzv I;7%7U]gWTSðgWϐ])hpؕB=B`p⾰+VK슡4=zJ# O  ]Up]Uhi烦*zʸ%r6i^#sp&-b Eteq,ޟĦOPN#n4YP"ĦvuFԢ]_~6(]+/\%vn*N2?ǧ?7˫9ѭKC< ^ęc{CzB^K%؀pҺLdr ‰XvG1R- mA ݗJiޞI%ר92Ԕ/ҍQRZm>OaԀf:\nՖ! ̭|: GdëLJ-7gdžE/֗/^<!6ؽJu,:I2`Қ Y(R ztZz[`QX_KvͭGە!>6k{|u[/,ܨ$n+] j콩AV,! id֘փ¡E|0З":u o${;A$d 6Yq̮ظIsoYYdgo]0 4fր%B@MSfLC&LJ^obFW 7wxJݹrë < ?_x{IrԆQ 0\ wQumUOaƟژ\YfN2s~M&>濼?] +ϥlU宂 fyN4gR,)#j yp*F~2'a`2e&xr_[Zd],~ MNe"Zfp5Kp^~^}Ţy_eP=/ƇKkxq"|Qxk///yɬ_a?Jʀh|vrQ$qPIanfLu6 1闟޾o߼}Ҿ߯߿{}`@[r ?chL-y0W)h02!V9 V͋>UQng{hiFwkE{̺YT?Ĉ<}P3T׀Y7+; /PzICW Ηm9.WHsơ}0UxVRŷW)n.:^E/CBN,C I~sI _XNN0)Dn5h h{&߆óӺJFuakZOip@Qtvh>fbuۏrc%6b$(\щTCY!”TO{";.x,`uT"C$)YRPE:]XRVYxbp(TM79x\r_.WYkUX})AFn yFA-c&x(uLԎFI&0P{wJ`YJ."A } R^2@QNe>ʎ3pNZ튈Jun|%)}yMLT_Ɉ&G&;ن%")D}UxvDhDMT MS^;+ kF$KRjxX6k$I '?TKF$NdٖFfh.ݝ$+*+#op9eEtd mc"(Z5*b٪73FryЬzduh.ks'Tz|E0<]v/ dO~kMD10 :kU95j!{x;4VkԺoѨ aj[EDwWtЩ[&I˩9:~A.CQ.fq@i5c+$?iQfϚuJh]VeZkC( v{! dB6<&j98&aaU $U uY )[5Q_Z}ejoQ7rk"[ml\! 
}i8y:1˶хoI#~%ػT-؍Nu]"ALMY $?@"hS]7R@.fhۨ #g,2] R++*1Y-MSԫ,ub(օEc*iTզXoE e0sTXte"p"bʞLFp&bDȢj9eRVUCuob9;dmb @CF!1-/Yr,,΋N$^i|_5$gUK.N8׻8(^#8֮9 {MZAZ *vl;yX˯ٷZ~ލ0<3^0499PkVN31z!زI%mѕ'F,dz-XG6Ks&V)[# I.P2kmAEPT(dCJf[_bLIT"%%r@]lhBZvoՋث}RLuHΗZFk7bGHfc;dTjidR[žNY0N2"Zʐb6'ƹdƧ mrZZ`teXb*aLQ KFRr~u39mO d#6Aj[?BR嬶*XJS1 gّC}ZU{6E1)oe?dOax) 5*MVpc8We/r?t ٠K `d˿b5A`7AnP+_LzҤ'ϖ)^Dt)9@ {؉~&^`2 .cSu18XGN{fjE&"0X[r)XYeUu :$ ^ma+%z7k<3Vv`ȊNdhgֲ1hhwrYd5DUE&,Htudk06vcCwf`xrHkmnM$Ca9=4{ HM^'$*92ϪseK x"R.{r:pphs򃼆,*IgE6M"f'XPFA:9YQ$WDes,h[\Qp5ċJ黮POǶ~oxo__>SRs8c \}`P07z(YgeMe&y0޽̇˷QGbև\- k5P߼~+4r_U\Yk./Oj=GĩL4T]`!gFW{/R9c9Ln/2umo~gMmHsgo>rN UI."B|7E$Z."ZcHrEtNE+#YM&uF^~?;?= <xQM>ad_;X"cHqCRJ΀B2& e[#OS]]=j251[T`5Wk`4"+UكЍk}^Aا'EjIϯ=Ain5.cvjr /k`Cnuȱڬm a`2UUISl[klY6qW \H|I>%,H.cT՘u!]jVR7wggu,[ޛ審A= !#:$v^1^5JϮN5EyX#5:AQ *6V?D~pZ4%G~P=r"k<禌:K}Kk_oNUrE>MP}snls!#f?Bx7+xBՔQK& *="lvs*fZM$x#DϨ${<8kJ,V^G gD^eUXt"PT$O:擶? tDO%ǛMDʢ ZI#!;!Gf47̈́s̥Ua=b9ކ#<+Fԃ@Qo(?Nxo ,)\+D27SlEhq -sCsnQΔ)BPTc wNs3w,6 ~ryHEzv-`ϧ+-^I[޺ppZV,|W3IXuY:%yd2wc-~&q6)'SQȪ: G@6{QMl1D Elܟ!_rn7nW@.P V-Ӡ0m`@!sc]Xo|c9M ʧ, (2[ S9nGzc6|(]V~t1#\#RQȺKz<[TP'Atq_qbN?/%B)} RYid# ,˫6&P^./vs4+=h#W{𪓝ٸQ3V`]PtHSHlL*FIr27It,ujb &h9!{ ɄD6Af&(6g'>暊|BFq7e_cwqnީ5tA}d Fg,E9UꪲzA1b!dl_3򔍋ZXJk!)[By8] CI+ٱdV"K ,˗'V5Du9eڨ"c ؔP0Dwmt_] {7AIl)+Ib.!d!j$ b["ÚӧNcz7bĜ$d8 ohhLqsي]VnuaY_}cC1#" 2V(s)QR^I?82%1 ye4R;Tǿh_{e6ݷm߆dPM>S: t`P{`~cYHyzN&il|2 %4pQw+q_Yg2RK1bRF, Ȑ4"ڻ2qLf 0H1qy`裱&A?omTD"\SL=gz3gI[4 $mibnIJ:t}iu;ңφ]E['ʎsuGlKpq>+f: HKAw5<XF"xqK,ʊ^h**D2{.9V@FӨ‘eD6k67!I(de4K=Lzy^(L2BsEH9;B(RXCi\ ߆L[`26$I$,y)&' =9 )B^Ȯ/s;jje)4]?ɾsJE=8zEk4%SHT%eDĨFx;c0Ad112:ǘgF:L01[/KR2T؛9;TUzFƾX=c!T,|teUakWoح34 7~@'f_8bgfE\+!L fU VS\ Pkrƒo~ dP=tPdF]Ԉ2v%%&*bA7s#J9mt veYDRCh5BupNJf@kpdwG,v#E 9$MXň" G(zedo*`<͜xcơ boc_D=#XqӈˬFZ#a8d4 *K)(p zEDk8cJ#I"p5\pƓL*)!(ݛ}wfvD6ѣNaHFɾ{EUqƌVr293ge:-(22IF9dK̾`}z:e/JfS\~|GƯ\PR |3:G i|\fڿ_u_!u2݊3zdakxt?T\M̵oy\ǵe[50+A_WK2ŀ,kw)V~ҁ,A_;:}(zА&C[Hš7 ;m,cD3U $NF_j?%1O|m݃q>sy5:dwߎ:uWSWhZboX^khv^L'.O;q?֣;>z83^sgpu/+a\ W>z.8 "U1WCbCb^$\ 3JeْPnݟ'W6_!8bD 
Mdj4\t7)OȆFǓzVECnk5I<;o/}4s'Y82|Mi]C``zo=/|S^3!_ʷ,SO0*MW蘯FwgۢqeÃBdT>ӷ߼>.+Q5Zj`vq&j'1`JcAr eeh%qQR>'ך I3&d) .Fk:BJN1xeGb5vZrlX)te;/4U1X򃁫bbWZÆWʕBWl[<* K+$WWdpPઘ RWbf^ \)N>9-]oхʰ7[R׋'\lR-{s_׶b-T𤜈J5sנɪ1l+ y|}ZTX;z5,5^024L.F?8Yx:zv~#].sZ)]rXv*`VHEgRB2Exn$rƋVy %𢨔hhaB7tWBU1*]ŠtbU AV;g;}ؑꋋOW֢&ZzY d}Y m7UjW]_v}ծUjW]_v}ծUjW]_v}E]_v}ծXv}ծv}ծUjW]0VrxbGb6&1$lmq)q\ZB%N^_戎qC&gc؄S ̲1#+ 9޿q}3thN )pWGTٖppRĈ,ߙJU`-<%~3%eM:`6\ͦdzi&B+oӹ-4LN㫋ө3,ac wx,8MC.dZDIW|w_:9Ȍқe[GAk'2 !~vxl}zأ[充z̏/<᭻~Ĕ5Zm?}g9Ab,:+Z#/u g$^WluY+=lIlNw=!Б]~n'_4^p#7d|ݻѵh1S \|ˉGN~ߺ&64|mj1O$?pmfڞ'Ӧ)}L-Yfq57frin1hi0:X|%mg/O3p\;.P;.)Ol}vYdL%,exZ T: m#ˠYm7;$ tS/uX!gi«`]TN[֣!<ĔaTRD#]:͜kίfbr~>쀤!u\,9 tlr5 en[SRSAi!Ie2 Dp)1v, *K)' gTl䲼,T "&/2Ƞ-}gfΎS)t!C7Kk9*p=(4Ux: ^ `FO!I_2B酦u tFLĜ$d8 ohhLqsي]VnuaY_)[U!Dј'^x:DWN+ 4_qdJbRS}l8>6M /^poMc[%!TӁϔVZh̝X5Fil>F+Woْ Z#zkMKa뻒Oە5K3}8|;3b8~s -?f}ywN0o-wi׵گn2DY$2eq^sд bOI4nksʔfWOMnQ>{{_xMAO?vwx%βXZakkz;r*iKB_+mܶhүL7d?l+z&\3;ZЂe7n6Rx_{ke yhTD`ۆQ\Kez~&n0#-.eKMI5"8O1,8\Vemx2i0ČgN "!B̘2pcrBeN259T`ƹlϫSco]h.[z^Ywg4w6_7';?s( b L`dhd*h8 9EB"yS++EN5fP!fK؇!ɑ.++s)!wAĽ݉k0%sog.ޱ): ̛mI_S<$f/$x9s Zo,m[Rcg]՞=n_A=Dee3FK7.D6F`FO>p9' U^BJ[A.%M0HxGGv=u9xGo6Xh3W˱7 g!j/[^D#C.LD*T̚}\%-"oH$BFS;Zi-p.5M:Oѐ ^)X5 1;=E8(˙ׁފ'{ܟr=mnT B$0h :ka6eF>TI dJЖ^tŢɸv;q>qH5!'/p\hr‡`@oDx}9ꐘ V2N%+"jpQ!E:R׶*y2&A`ǕoȔuH'˞ gFalѡ f-9_;Jnn8O_km#G/{wضY|3@pef..dOWnɲeeďn,Y_U K~N _ 84&(%0?$8(IZۏ[nazj}׷0|ijߩm. 
ɕ:y*_U:4}xhTQ#;%{J24㋽ ^<6U{oSK-":"$:TtByԵjr%Kr.6AEq4ȯ؆v0obpjkAҮ_m$zvvr|!cAkk!ߜÝXLmygQ[79Rʜփy6៵Q{28I #d}8Jãfm\% nѯgȎ{G8!#@H\k= ,o/Ȉ!Ƶʩܽ|cO&I稔>|]vUbQٰ<}~O|l+ %x`0n,+;՗W_H[ƾ~=~~NymMЗua>&Adk4:DҵuT%n85~xzL~s?>, >Y96&@1~CRCx mJκ]5e;ڸ.ןZhBY8v{%b{ItI>1$א#i W&N瘊hn  X0.U=?v0/ Qy}4{eJ#d+9"1K9149)Ȣtuי鮔I,'6*HLVLved-"YUXջ<4J^,o C0j.s/QɗEM$ )ˣ \(Uk>+,om毭[ ^B>r?F.ZU6ݶ95Xmx2o "Z6Kc-vT]}t߶w~?AS?3;WO&yEˮQXj^sRO'a쿿bRm/v:>\jKVZ8Q/: a➘ gDE.M}^zL9D(FGT|BI(53<XK蓪 ^@>dg#$|fvWWJvٖH8{4S-[zstw۴Ǹ.qR8*{#+5%r*WluEͣ`6A%YTe{\ ,b.M`2 ±L'pWlτʘIB81Ln~=xLBy xʋ;<.>^)x5 H9Vw 6pHAjuetOPK`dU_#TЎCDž@Z 2A(sx,e"bZKic1 $ :az"^(+ 53v#i=gzwj{Y bktL֗*!AL+l'E@ 㤰,UzWq)N'+H % O>3ICAL}5N%!seR`@ELy 9l#'@<'(L}!M).h*rE)jLCeK:":/MpփT6:VAD/!E@N' A`W%- 3H1r }zM[&ʫ$("Rit S q pI|ԊY,֙< O4s-,x9W! %4@E'gAnҜ6_+T:t|-  텤EBgz|:M!PM&OahţW28b('ЛVv3ᖌ\*LoR.l)JOڒ: !uiRQÍ;)/RCAZ!LNX4.jgh R1*{<^ȹ{eK+2L%*-Eɇ6*cF{ns:tr`Ჵ.((zʕBb;/8}e)&j:9Q%q)9h,S*%ʦ2En+G"4\-*eS<9IǨ.2b +gYvX*[6*:B! !d<g0Qx"Dk /QF*iBs9zV؏ճK^CN?8EdqE4 cJN0 ( 9xF!zU jq'z3y%d%$z<1`r1H)@ 9A!4m( ʗ^p;UM؞x!Մ!2^TrK9`&2*xM(P&E4dNs[9Җ0;|Ȧo{' *PvuURڝΜ"4xQAXʏO/v;Mg|z{I\Tum*@";zg'r6QxF`M̳Ɨɥ\ejw_+ƇPo|>3ObY)D Q_4HY%[qdFxggHyRv.w9g˽ 2|d9A0&cAWKo%)o$`sj̀Ę`$h*-X:c(=9ӞSE11.DXPq$rft ms6b5q5E}(Z=s83I7ynO QAo}چįMKM*;,A ?j(ON h*I\κ [y/Wc/^`)' n#ld@tݺwW+׬8L/xڟMl4MLӼfLPmV[vV[j}wl4ϫm78 [V[ٶ L sӤP0@1pK<ӹ]Um MNSXesUVP%x嵦VDCySzU\-1ҽUGt<3c(;q7YȶmKZ5鄻]oGW)fb/@wMp%OkԒʺz(/#htW}]U`0DzIdF&H6*ZI~\LyWz}qy2Uc>UnT^A5jxKSUo 78G}vn~| o+"7>l=+AXa>ҁٙ;Q;v뺍yw͝B-bDp,Rzr^ɥ[]pI3f&ϣKWAAځ3e( )AAer&ԥy\LQ88v(v{"k6Q%oS̑IP葥((V"N/A{pH/S` \a%XxӶ#)]6Bwbn&Smޣv#a^FM\02YB ɂ YERN-8X&3bΪdS:(ݪ 8=ynrrLkeP‘Q])w&f ^gx4N*wĩ1Ec+N+E䢴0hV 1I=t"b6w||b˩gp@ B_.lv9Ya4]9eXKo $LH(#1Gɕ`p43 :H NU hઈx,pUx*RZv7W {Tޕh"C"\ABk#+X}+W \i_kDs7WUטc+V3qpU'zp%[P𢡷m*յZFw=0W1p|~ʾyObn=Ȥ>>!C?E"(p~׬.ySl徭ʞ]%^C`D͜cX~׻Ay8Bjb>.;Z4o7n?y1ޏ=&\dߑ @CѼgeY \G o;}ߢbI`fVfw+r5KYߵ4!Gn|Y_]9=\!iQbvO/ۘ@\ePmc/n$wW=kotQ^\q1gtr\ڥĬ&e N/Tk/NϓHPכf[<f`Ӌo-{S-ӦA >(YO/x}o&.+٠gJTuR+Й@GCPHVyQ z~mko87BWVy[%&tJV:Ђ=&Oi.꣡` i9|MS\)-_ Yji< . 
dB̵uԈ kBF&-?m#޹,NY2B1bRF,@E҈hޅ)`E1(Onm_kKlE{v{2);bo2D7m;i2Y_۫lll̽R0BKx**UH ixEa9BH59Ev"dshq9M+6؃(\fd:srXv(ᐝqUQ |yJr6OU/Z_'ͅޯ1|=}Aqi}8H3C0L2\3yH9;+[CҸd6CЖU)lCO%+BBOFv!S^F_W#lb8䖦㚢=.3Vav'0a*EQu?Md, $e0>0T޼Ud 3UI1*'-ghdqkHs%]tcJeL̂a &cGe U)rD FbkgJgXؙf슅1(>ԝ|H- w0:.~o<#v(dZ4 9'eR6 ^`dZ+' 7BXM[Fda(\BHۨ3v%&*'A3q6#By(ݙv j O:wc)bͼt1u=-HPׁE1d^IǬ6 *q tSo:bi Ya*-ԼIJb;\{lHGܐ(Vh' :&D~c=#Fm~Cke?|J5SgW%F6ij[s_n&>.:gO \A..۱~l$j4珗n0~|@wbgz_ 6x[6lY,n Ydiv-fɭc2mt aU}U> }~X3!ROf<ֳ]O\nz̓'-NQQ>r}ՁQհ4R||~)2#xSwMy3Vg֭ד?|x, "'WSǃ& |3{ [ -N2oņ0-3O7epm3Xc hTZk[KUMfMKOG͋I5gErPszeS( FQfT.{ۆaVCP Kͨ!y`oި]Kp(ˌma^D eR@")/hY}Pfep{8 ɗSqG*htuǣ nWE3n46owoW?7?'WLS~oO\!^IyjXzYw0H!Qbw{WKQ?I8->IQs%3 *Li]~tvH:fŞps#ce4ImWb_(~ .Af~ObLxUx:5Ԝ^^]6@ eJ U4K| x3@շUWn U$N3-5e XsgX&hAJo[c\k@)uH҇Lh-s_QJ' "x`GFBֆfӹ268+jVd5GӮ7<5L̟8 jc10ª?] kt)v'dM0'AD򿐼$ yD 㿨'.J+Sކ(r*x(;`./ɏ'WĄ^i,fmA֘ L4(֡CB)*JdLAT:tB](8"Gd)gHܗ@L46٢PiIJ se,d^J by bawv(& Ds,=K3J5SA֐ -hB!j$CZKHFkjl*xWOϘ˯9\~$޵0Z<30" vńt/%,K aukG` ~;a^9]y9TDn LIB );;\r57 JH,4l߁H6&n ,)RXT5lj-]BD"LTJuXlv9 ڪ6dJANd6ʔh\2]ZۍڭUVӹ[U<7Iɽ>%km^LE8=U(*\(V.PcBԬ݇KF$`PBBejf &}<ّ (/ \ְ V5|dHP01EzL. Z -J4^ hjlҤ,pl11KlLgͦ#^A""T~|(*h˨d :I%*<rK6Y-N`Ty|-w|Ne EH. 
@J>_VW0B2Q;zZ͠V8AOz!d-C!1Dk5w6f2Ff)yLPl V73+`20&5;^?XGi(֤ A,Q*CJ$%X,:!qv[ZSm :"4&IԨvڈ($ kwQy1f]DxC+ aԐgw[f۞a~tkRY4l)g`g3A'}6^O-dZAvQb [$ A@&UpV;8) 1F:^I$>Z51Tus-M<|ryPȫ54,7bu+k[\7@fꧫ+cLժsp Œיϩ 9XQq61qcB̀ސPұ2@Mw^^jfv&OWCM3 Q y Qa6%P8x4ff)ŽrJ@av!))V.9Agjfssn -"rqŪxe<eg)lreGJ Pd̶&beGJq׌(c`[ mvە*8#YҌ<<8GܮJ+fgFo&l߫_ǟƓqy@l Dv>#=j ط^.g1]5NN}}Ǿ:Pk7uZ@VQ:@gT-1Bɸ49y2~Sbx.-,:LΡRWs961 #CC0w/q|2:G}N -\ADQA&P2,^+"Vb.{؎NOme ~gNfE͕}kӗT?ُOg6UJfb6jkU V7pG,Zvف1)>: BA^&%()%Q| N$-@]S/Liʃ~'q8pOz|9e$w>8XیwpۻŸv&2"srA` GF1$2 ٤R Rt*an2$I5[6uLWEh-H23L->+kß@T m6Oah%hwQdVۤxV[/!Z)zw`fѓlvL6 ߡ&<#;s)9ϟ'xaW Vq^TA~~3.miKh6=]zwץwۺ/~㙣WǏ'o}[wF@gv⿘SLB9ua Ac]jRF-6S?r9[)ۛ߾ݩ߾ӭ뭸df ~2eMHӞ EDZ#Y8gX}C֛1HA"d[^IDQ-i('9S3UUWU^FKb\4J|rvV3kVYsxÉ#-OԠ|"ims1Qrp ܶ#Z-l 2TsB1fEFC~QYL6n~Ud֯淆#e]{J{/N?_|5+2N jx8:xp:]~~ř40 V8>E|:y4 >N0+C3t2z!FscĢ8E_q0]:ӾQq}G?6bLݒ0ԇ~PNrQ*(:?ޡ4hneP۵MI'EGjhrOBWM0i?b>GS827B6x) RfVGkMZrWZNژYn8uH.&>(o} Zu;О ~ 53? {fSxlj˳f`,7W ?qIuF榏! Y Y:`T+=-a c@g&%~WbP3lCzP>Y%Kݑ%iaEɺ=tM-ïxFO^]1= 7m.Pg,Sʪ7k\[$j[>;81w pk:{"0Y4QwOe'1a;mΘMl>M̬>E^#-Z6OqSvb(D/߮:G;m@<#*!pӞ$)hn ũ!t$܀SY0/u)^nf*qPL$,ب#fyƠB[#(F 7zE^>}:ջةA\7HݝRvVq͞kR.sB_y*P5٧ =Y`nBOW}ГUzZz^a*b ڲ2=<f*ǟZ6;D3xaL <3wi44;%gY=bUTKܮ}+b[V|8o}0qpoKa',-N:͂#˶'妐ޱf>r(zyx?UťPzh0lWp`v2YMŤj\h/TOvyA;Oo$$eP*U\lFbw02vp ܐr_pCVv_R7B} sP`aޘ+W/*KLUR\Bs%8쑹AW(t_U2YJ#zs ͕RG* ޘ,ٛ`ml7W\)nQ\1WY\\eiﺹR~U+9YVISN cgpYxMn!qӊx6 M?o1ܦC/yU}G "kK^ݛA*ӛfovZJxs+DCYf#fL )h?V M-_|AY̋Ms迷0.Q[Z\S Z(![L7>}`uJOcUzVe|ES}`K 4wIQG;j\qۜ!h~Ũm.9TDO 2 M@%jD ;!5ɑ0eDϓI)H# o ǹ*D&L-M eۂ5LҮpn,W\kjsxyw㛗Ī χ&+R*B: G4$1:d̽<*%*DNj*tP& Ar!4`!D)gb$.xupɨ RL*H F{e钲x^'InQKM Su%+3Gyr4E/(0)d -$WsVDe X&g)KokZz#GvemGY^қf9^HD^)N D8YCJȒ0<(^;FH%TrDΣu8K0͢B'@iNP{nmh!^$B2L( < Pugݬ_?QvZ怨iL$-dT8nI $rp" τ[Ls;ܷzvqeh3X*Dufhδ)=j5 U8Si-v g9R#cJiK򞡯$6J X^XϊzKjc薉{tdE] 'YO UJϵΪh}j 3c'uRVʻ c穆ܠNg*z- z{ {}6PC6wG5WPl܂o4Aojfpv>&' 2A*z`UF:Sq2 Cڜy;e2?jEҡTŇEy9YZ;# !񤬊ԉ*ӆZic?Le͹j(^>P魧&Zc\.28OJFbPi_Ĺ(6vK+;reOu?8έ÷ğdFoFpߒ-1:LwP9zm>^5m0Aj'!U/_J+ ATE"-BU0p*Dk;Pi.Bl&5qŸDh8:=G4b3W82BWW, jfd5ی0gUf=5̎z`ٚp&N >zL 7FC>O7$@fA\J,zdfȝ6_ $[N;v[n򭮴T-n $}ʃH0oz# 2RP`B1)J  
[binary data: gzip-compressed archive member var/home/core/zuul-output/logs/kubelet.log.gz — contents are not human-readable in this form; decompress with gunzip to view the kubelet log]
J^mj1V#OwEeckVXya](zX[.Ob3e Zޘ.[=)gx߼eppyq L)҇69̄2]g2}]~@;(sq7h+rMJ6VuT;X%wk)Zcw4n'3SRٙ;wkܭ UNUJ 5vqۺI6VuT;XţNA+3mlVD'>7akZ;:_'+r6:l΂%@Z$IO(@oUSrx:M(s@ 0% g[CA [C{ ABΞ6CPm`I8nq`Te̖)s>~]2Zo}Q⯷-r6 :z9R$O.bJinnݖ13Kw&2旗y&xm;tc;[ɇz R1;煇">/I}Q?3_G关e&受.u+7O܇ds;yš٘>|:3~v8zy|q6 .+ pW&Y{qcrvsnW>Igp},>|R$ |qIo[v Jfݟ }Hq:ˣ/sfu|Ü0+gf׭W YzQ0Q"!t@ 04 S@2upiҞܥŌi8.- tMor͏vJQ:Rරk k^zadx6ɮJÙƝm hPS~ء(%1fWMvQ._ϳ`#<;n_LO 1 )gA1ch!Qo1&Hs/u㇍z@Y=5 ~9 c{c#I9qDh؛>V rr$B09g[y q먛zgG! {cP2왩6CT!H`'`0n%0Ku9kBDeei7E?QD짗^K;J{waW x]ѕtn `i=ʞ1K]Q'' E..7^-"=VBb'n|]Eg/4.q}PUl,QI (A\*NUeIU)9JJ6XwS橰)eH)2 ~ #ܪBwWuAZ R]IV-ZHSJ66{]CVּN4Krwt[$t8oT !/ 5CBF:;B88ATFLҧC@ vlٴ"Ƌ?/%fb3+Q))2ܽsp0x9h?2Dpe8lėld(lD+;6F\c'e#Tab=b@wg`4 z/eBbKӯL[ =gԉYqXS,u<3VM2ܼT<=2;֩7;6)JDy6 Wه ;/X,: N47&[e|̼/ G&]UO[_n"t~*^npyL> R1;煇">/I}Q?3_G酳e&受.u+7O܇ds;yš٘>L길o^<٭r^|Y%ѷfv/]2~y|ĚpW&Y#"WtncJ> +Dj!'ͧ6{m+F1QgR0Ɣ\¹a.$%H'8Ӓ+mIb%},PZ`Ӎn 2I*(U$()ɤ\X?LO/2##SEn2`|K?I>D>*CѢ7>;A9zMn(?7Go|՝Gov@tTT q&1,MxJ39Y##Rc@)J43س ~Ą$):I_؍UusgC]\K%?ٟ`pŠLM<79` <x4U>0' –OgǏZ=y>?s>/ugC:}_vBv6~HX{H_$j/qhs ;Y<ɅH͍J%qH)Т3,;nc|n{ʩ BH4It4$^,+)Ĥ'Ͻ|^4M`Yň.8&:<"OaG2e;r S;LفEQq|s] 8&5H$ ;rDDj*XwARFcSOm l+aۉכ47 97͸rGM&DfwR(F|3ama_rsug7 ܚhG~S>zdk?-ƴ ͅx9:BT(f$hI`u6SaV=7Z:һUo@a(*qԉd2 p*Gzy1j|%1y OUKe3iwL|z3*5"Yy{w[MX業Zyȗ ~R֎L+ƣ_)ܳ|2_>EAjݗj`Yn?bO&OuAbjk'Y8km&Obds'N?` d%+&^{9 |e:T?d>ͲI~;d־VW$<+5=<.^myKXO5 Fٽ1)  NX35fG`5)30b]CQGjN5FQP4J7K/~BQ.g-w>qҒad{ĜFb5, ? 
N8L(%aobYCPK*QOs֚pME(9]--%m,: ޣ6yƫPRxRyݿ&\6Ovo"a˛~^lr%2(YOaM{Jډq16;j-5*ԮjȤx=$8ERG0T#5sDْE^7^W@Ӫq@uO~,I]D{ƣ垇{RMrhg-(JGP珍'T%IxT+:&}']#1\}UYq'O N#ygwۋf^)DxOpkѬFp >XIZ- UPMnͭwoXk@0Ĝg_7_-v{l9 ՟ΪlͷolSR?mlېZ/nYpQs"Oiݧud^6y ߡCMd~@T"Pno&I6P;:1H0`Nl9-zxYɒ:a0q6Q>2BH|L6vοNՂ2ɽrZ?#z5]HU(dPぜ qj.tvqeV2xf}jpA":$_-.~+/(̺$x}5QBКefƓ^瓲v0gG{Z^-WΝ-dw6w͇[=.,.X.owFԄ`W ֊ J{վ&%G%j5f௅P8Tp;/M41*T3ѐS1k=6ʻUM:sd[c{mO H[O@.HⶋHtWks˜;LA(:n,܉"BF,$ۊ.%3fMdH ݰcAkԣ&44:5XmU=L森7B+bIMju˒ǩX ATT+VTP%-rn%RD/9tcb3=eATd=!\dpNAZ?Ւdn68!mh v&/S G:;v.7UUs;M;Ep.񂝍۟v4xYu'CbyCzx5>4Rp Uj!iOX E}Dk%srRHjmwX``UymPw[]kc Ʈeqbzq:C̸沧j9r wT <)ƶiJړ`GbgszPE }vgǏ2 ΙrVme7O*E<51Rf0)=rJq}(D>4PySip*xݎk^T@ /1+coϵVb;5L b7A2>+Y'Rb{.?IbdzY&O?2G=e*Ɂ;#ݩN Q,{ }W+\=v)GNp3&mqJ_3zaz 5}_8`;&h18h0[+ǘFڊiY!1y.SlMCs԰=Q(G!>@ˆE/b c 0N?'?I0G{b(D>tNx^z ᘓ[rME~R||*;݋o@9fרu#;NlEg<6p"uwsN6x~c&C{JGIibkbg]yT>g.HNh-5"/BEN׋v(cI{)(CO{//7qAnmD5ϽO*Y|6 ¥+<5ίt(VmL`٪P6Ǔ pG7/Mrg7gsB}= ϰB眧8Q:C$E&E8O(3Ɠ'SX҃>3K` L%(CqO@IJ 4׉dJxA+z?/d֊D5@kX0G1KB8&Cyǝ ҔcVjy?aH+L9RfY{r(z7sO WBZ9&^Z!g6l\5ΏWXk9+G4p\6`,昡㶙~=d1!6iO1{ DI@*/ ꈘ12#5;ԢtLʖV[lz&C\n;ptqV_k-dy|k80druxLaS`qR~ocXҠP TŚus.f C3FwH[{(\x] 6ܽHv̈́ qlLjJjO +nnk< o fPHVn3 lbqs k:b;Z׈-DuG`UMVs0Ѥ[|J3=.>wTB̝P ic.MˠݚTj_;$PFqJph!g (VjD1i8$1A@%ZK[QP)^ZƆlbj?L`O"c.4AMX^?(͙4gDD}!v6UT@ K4^I9Nr͘JMc!)JVf<*LǫGUUߋ 4Xwo' o6ܔЬSG~I_Nif>xC/YVoԆZϣa:{\LcnC?Sٖ早Lej\xh}՚g\5*O?L*IHsG3G]L%볮䥅Zp&WVx#$|;J+=iYxN68`CdJm j v%p͈@ڼ 0j@ Clh\mʵyTX lMfケ}]t+- qĕv`Z֍DO2wcX(⻱&zY0o,.OZzGu«}3]ӅhS}MȴKr+ܛ)'|T0p @jv fܿϕDM3@Νy;2$MUfoLC3(?}992 DQCIa4eEQ),\Уs [q٢-,$הn8`L e>H`8*x*Y*Àe.YRrD''{=r9#VyМ\b{27+hʬ3bc-ˌ%Bf'j_;LOdJ '4HBY AA)^ށ\pb"AI)A&P*JTG"#iD$Q M31= GjJ:v!łg2 ǏeYQ ovYy px4IICRJ0@K8JY5vQQh XB EZ=9'V( L#"bHx[iKƠ05[߇q` G79Q#*".ퟃ7vzңWtG1ލ;wSH+}j]UywyYV!oYq[k̊֘3_cl5ocd I)Ȟo5jR' hvm#}?Ry如 n69\ynYeuFIb23F80uEk$~wm~ZpGJfC!:xnw`3P<[t&&Y6#5f9>adվ:,z(19FV1%f}Ҵi񞝡eV򱓳|o$搻_jitwK,i,iܓmnytGyׯbZ4TNh J-DIЪs$SwGD-urA?Yٚ7`%Qn}spQqK[߲lo| 'ނi$g+-]c07Y`MZSp~FDLvo =y̡.YQFEjp&s,Kt0HsOSޭgfjoR7[mФ7hl '[*fG}yӫ7jDamTC0,ɀ( U&a R1BR m!Ghg(rãEqM pu@8PT=yo. 
Zx%r?Uq”iSN}/ɪw&$ TqQ1xk7;b%x{X B=οh3Y/?i{c9 @<$SGt˹ !`.'8jˉ,z&4R< Q!!qHX9 4w0[+"ס[CH0NHaw|2ؐ,h JU a][Vkxɦ';նKOڱؾ@'he+]Q˷w/[ÖB@$w.(@0\ LZ5L%0$zfh@/fFQ3bZۨz(&źm%Oš нraٱ{9H2{/IxڥJ$=75_Mū;BSe^vԩv*ʸu;x:uDex`3VE-6} T-̨Kc%iac$R↝(mO$nS, 280PE$F$qbBM3"iAĥ%{jWVfO.} _'mp s5(ToS2^"9\ȗtG$}j1|-*I  !J)2!"E$ a!E#K@}B1C̃Ź3&BuD ݼ%j`x:Dי?@3D<M|:L O?he7M|r&zZHM|1ξ懲KBKJ7h'%$ -}v*nZ6ɇZi]ܾ:᫸=)|.XEb~qZg3Ze5- 'Ұ @unW0Ú&%Y0odk90HGo׿G(._}m\2Vίwg6Op'0i;?d"?)MBTqL8S,5KD4)`*4u "W_Zc=_S&xU90қ: QG^Kڰ٢v|!3 _b/tS4%$Z:d3}PX ǩZq Ǒ$XLdnK\âW}Av`⸚b^-n ԡF`?a A{B+w*p-$4\j[!i!0UQ ]O 7[hB\/ 6[O6_=Bª"7w RT^G@ obdxoi̟Vo&xhn)&io+@9Tb4>D,tRB#tX[!;Ig!Lbu(ѧ`ۣЁ@ eU6VLWwƎ[)VF.Ί OZ6&ZhAES#q!|#8$ C]n3AZ/v5wk":@`Yql}b>_%3(L& *".o7yڔ"FOsxgc %iBB,da "Qx'0vZ1`^O-VCa\ֹ/=Jj / CA,YUݞS5買,&mQ pQ-Œq/Is'\4g!zt{`(4nEh @ !G5%BLS&?*$\A;]m]S.߹=zZAPqA^Y2s+'r.vnVV@OЪ"N~kTYvEQ :^-%rVj%rD:aH{+]։Q<< _B ώ7 \aA ֳk j#.[2.*e7*gj57ʜm%zk ):L}r~!V$ހ G2ϟ7$^-ƚ?jde/BǧV&24a5R> !):@jV1 2ekI ED,PBqsaF(M MPt#Ft;E7(UȘybqyZes٪|6%`m?L7i;{?'8Նni?lNwQ~S)).fa=kֻvhY kpgVIBu,:'ֵgQ.ˋg0gz Žm{"dOjn*ϡ`=%ZrBjX@ X96Z*O"eMii.U u] F-""#R(iHƀzZƉ1IӇW6/:XBuHHq[vh\(mG=d_BηV^ ZhU/ Mf6۟Tdf f!IHD).y'y;NqU6k=U純c&$VPZ&~qQ;ZUoaz q_JC9^**0;>;$YmYޠ'VfROwe$G/K)C@?,f‹`_Ts$gچF)xU̬b13$+3Ȍ -7 /[(85½S\oɢN|A~9C%P]$-gͦVi=&d|%vXz(EQ`Zz׵fSQ]Eua}HW+ 4eS#= ՘^.lsZ)1B){]kU'ڇPg၄8~K;[K!gpA@T|J_}w Z+5.:^{߹G`b0]S@oaWHh tMNd?^/"%nP*D #u݄>;uqQ`|4ҜIkL\(Fxp,a=aU}{/z>y}nT܋H<K z=3P!)Ք8&ZTl(^f㧛 !]5S-%='?,MB&` >ʈDך:D"Q)58x*)nѢ*fVb/N8N \b;׎|, vbZ*9L*]mh⛤JfN9F0;D6 >roBR=R-_e;ʛ&X_|8)ң_Mzxu rOiB!d=ګ JIoV8Vc̛!M#yF:'/Uag;ƅGFt\',\l7Tnq)y̭/(XuE>qP 5GvŒtMOk ^Bv=]Sa95 .A% !~T )uap!ܩ`(DK3t\Ң 9 kv \ҳ5)SZD ׵g1NJA;#ޛIoPXŏf,GIrMVPtL|Cnv{$?j9P}RawtSDLJƷW}b]uu F,t-˜U=~j )vS\ǥ6WOn)Qs( ;N{*Z:hU? -nfJMuLW-t9/^Y^ngjᄫߏ M?|rT M<7-"Բ,|vS?zIff௟Ǖ%j6V=OݥYXT_LXfJ5$"מP7bߔ/XqDU?S'?e !Y! hMI&tQD@ ۿBH_ muTHnI[ꁐl=YץJv[S /^ ! 
C 6Ǹ$ r찓 zhs!bp5DjAQRkcQXYR'=LyǪZHK϶vWdL챵}V#@>PJ9EX:WZ |{=TZrI1I^cl\JL8 #2}'$ R!xZ/X4y`=,ON$I2'q5~/qb#m?c^7h*%$T xߎ.BuG&]ƂSK{?s~fYƏc.K*x@F| } LN=nsFif*l,`9nOEBsm4E,0Y325DGBvºeo `@<#19rv#=`8kں򡰢 7z:bF"]]Î`U)kQ6֕]iC0R'K>,8mOrQkV$͙, 5* FZ!15._i~Q`ҕϣKI1G~`H73N۴Ӆ'Hc?AB17;eK̈́|3A^/'W][O_dUCeYItG,?+'WO|\d6vpB9єx"+}<` ظB!4Pq$/&ww]f܆F[_ ;Ae};f%lhO6o%oy&}JE$Mn=#FB$ C{&4o5: Ü)ikG!GyC!=AԎ_NK%'o: :c q#[+k72[ZӤJsY3z.Ik % Ԙ! U~M: HA[4#]!GiiL p@8㑴q!H=DPCJԟ+I?SxP,^(Ztò;ܲ)M̩Iٔ&-c7{|yɱz?~P#m4DS܃ONkh RBw Q௑+7De3]S4R"=CTAd&bRm|̗lLķZ=sc%۱fkC8xBV5l+Lz#M@V+AoYo7Kۍ>ľ]k_zK$f|` baߍ3 )Һ'Azw /R-x -9]ɯrWAroָV!ߍsx*hhWϗMa΋6M_NAIa%EkWR}w`=_QH%бڛ p{^kZT'uMS|Vo:U?ff^c*αAةrT~Y3mu0)MU'|r}mU 0~ƍFJ^ _.?H@ߦY\%|PwSǃ8PRwcqQQA@jib g Zoh0п,6(+RR^(:Vql\ǂ߶ 7L"n-J-SԼJI\Hǖ Jht?: \c"|58,#nHd( TQ[ oa"3^EPC$z-L@ 2j-&Is7RSeٷ۶-|ӞĘ<*7zZUkɻuAG͘(+1Ds-;PzĆ&u1cLSA!QUJS.$cھwB'=c*z=Rrd FQ*BCk0@NdLn}(g\ !f[S֦!¼HVN#e1IU=#Tk#$ jO:eZ'4V%O(,лXĥT3 1 v^1b"̻C̡^`LWDZ@|x͗_MK!) גi,(/qw@9N 8[E!0,SRяԨRtCV2㈠I!QeÐ#:-2_T\~Ÿx:{xP J0yX0{"MQE!鶟;/XV`YBHx$v!l5o/5Ȓ4b>Eս8eO鞴[kxٟ@' q-vfv3 i}!FfrywnBSτH6KYj݋{ .U͵H#z}^+U&蛲q3 F9Teܛװ Z_Y^*s'3ږwϞ6֟=վlZړ\7ܘyS(.zv}2NJKE%7v'Z-e sL!蝓npDzNZ;{wqHf o﵌cBGM_g1`QVTl{WqCon(gp3L 1e 3:N@Eo5b)ڣQE'w!ڜpr&&I*ٶr,v e ),.{!ARgԨZ"GCHtOtq&щۃB.q~:N>: d(s,nܑEf +^lջ\7"ܵźo1/I=>:GTZ0=|7:Dz(#]w~P7|]4I=u=Qڢ[RrS!#*O]^$}l>̉*Ce+tOH{΢9tqU..S> Qjq;zMN]$U07~>Yw;~'=XHN$8Q*KcH*8ua)4/i5H:.J)M, )"U~|:'Hi˒RkɒRTd )[tT.W7wiY/x߁h.a3KH ݇-:+\Lҥ .] 
>-KxzU0Z.((,K6 @р\4Q8^&PT\)(-rli=BHkړ/( Lקkڨ(2~j"^H^U9F񊷳>MH<0pi oPjQ$GYWQ@4GSI(4PЪA_{$Ot|/e=Rs6FXme^]=tG9KKə=3c^L'> l\TVA~/t 5uڽq3Ӹ^|^b닧jc(c2JBɌ% nfBZ@R~킼IoVV_B${?7e$k4=w~x?>5ntE>~ T1 z"-U2y r* ǑT48oy21f: 7oAQ]XڅR㷅\>b #LW.3ۻŐao~ػyYNߙ31kRvj>A~FrfVX(\}brҁ1 s !4r^ pp6$ޣX!)au6hwVX'^JaNɿ_g~?oo듰Z]N )kc^~\))'m>اەYJY$Iz ΖI2!h ,V2y"2=!r!A`(s%ej;ҮUⰩ(bupI`\_Rƽ]i (>%f/ytg`r õdDH#GM߻4Vn@'?7 A p<#e1aQ Tk"=')5: B q)L8#HZ`5 J`k2i0_!#e`DRJ0|ےmdMEs1 7(rUd`Z :I`%P'&t{CI`*9k ڲFr Pч<3h\׎&Ɵ1ܕamW&~M~tmE⧇fU~L h,\d:u= W>r{-K*~֛U|3y@%>3ڱ0(1±D HQ g"d,1I '2,c.;\{{]ҽZwﵹjн}.y=.hC1R"%݁ũ#JSp"iu!CA)1ȵ1pclbJ4>7rtv1lb*ł]ܩ]:_X =1h~v1lbFŌ賵egknŌ]Ō<_ >R!9t[g: #0Y=TA |چJ]Z]Qx]1=[w.感]):[S~v1;6tLJIWc,]` OYLl?䗰#>1E:Y.B*B_цeĬAg*gw?d[p95 " ˉ?w;Wxy)n6#fayVG hޥ)klc/D )@T Qc*~}O: &PMԗlP~MJB?;\)M~>OŀgI+*mՀI q,m㛇 >#MTBD8b`MIBņŎ LF\C a7TXx#p$PM*)D?%Dza <8`RЦk%!ŔϚAEEe[`⬣ U5|kX@n[,EX'v2ڭ˼rNu{~[r,SLu)YkǼ1VIt12@9uJ bpa`clo1;J1,r+ju|,a>z9Z?͓lyۄ5[fzϟ?sݗdZ&n'tA'~ʦM[͌D̔WbƉj,Lit97 }/Y_ynꋧH) ~sя2X&yD`6CfI 0%l3 1yRg*SBRN4Dj#-\a BuZ1q w-0%@b;"ryBCW` @gnͅ"_yH?~g6>FҒcq G$)7)i,gnN:j3;K`h>2BLLߖp利Xӭd&<x Wd}D%gddfY73!`-XL }A(. 55x/@P&_T@ijΌ9 G2%G`ZQB ÇPeށ"ƓPı)WN(`KjG q!\*(g% A KI- #DyTXDaCNOB6 @Af;a p&0';b t-S|)p8A=C p$ʿ6 *ʼnނ {F$#ndbHHkx$cd d NK4>,XoΑۄ[. "a:xB`Q>Lhg X1@cp;O! 
$C30IRT ؓ 32$ q0B B2^;Xi@HJDk_H$T* Xi &ьQ8,rD& xcFwd3oWKҟU0V^OwΟiӞ4oΞk_FwW7ffe9/1+,~x7C>TzgeS/ٸ8)0EWxV)qh+1_h4bgm\hWLC!"Co~yy .*c//Ax_}TGmn[HOlC7L׃ $B/T~sa 2.& kmM>tG#x.Y\ѻW\>r/fro0g?pYx Ch4ijQ`.rP^3CߔITEj(ۼw桡ڭPu # 8?{ 8; f(\_^9f:'n37&0bmה($ -< |`C||5 {q7>h4qwc?<#uxi7I p] }ÑnpF!CWzy2r Gݰp3'"`q7|w7F߲XZ7_M;T2AݞOl~R(py9hbH}Rl_wH.We{6`qز61΁瓈i9xGRR#9z%Cp'c2ih;rXmE'p;';yjQl6k=V/l' XLVX\gNf s9X .&3Pii ,T6B܂k=؜&܉$_qrbb} _f͒r"v/򤋴f`Y}uqg=9?nˈ캌~]o5'ɠh'CC*Yץ1CIy5/[a~h'TVHZg~rn=W gt^~la}cBh".u>Gb(A[UeRv*;ق+%F2п]#\'rW;Md^mVudUٖU lx[F6AKzÑ}5>)S`SgvaK0Α).1zL03<{`|X{/`eaQYGYض<>OsMJz& ?~~Ts,1_ÎR][q׶ -]H`YoiZ;D -2SǝPfmP+)4R] ~?Qq_uF_:"`cabF<wd/WSt4H?W'#|^ׯ\֏]P\^Kũa6l:M/)Ls7 A+ٍ7MqҮ鵦nZ"2Cz ؇$zmM!E%!PVE3:cofFh/9>h]ťO0'l¸7`ѕKBQ"Ykw!)`Kb;D"-TMU}tԋ m>k") N]RkT[si[:OR:fD>o Ÿ:Y@\_ǟc}N}0йzO7ݹ)!eǡv0uLWvzW%_;>cDĖ"X9OBD2 n:&‹fyŒe,*?~RHCx{8-A=_^:K* >Y烡xLfOQq!OvH Zw =rՃiAe o8͔K!DZ~$\rgF.5Ъ滌"TcN@RHY<؇;i1R"5q6{ ]) j^YU4ܝaV*T)hjDxck%;bg#Kʯw"vOmCxgRvK#& _/}s~[W6aغv?rU*jL/7@ys=#dg3€Ӛ]e:v&ۃEqsW{)|Lo࣫j~( :0hehL80"BI=V4J}9C.<`8qo;o$䤰,U q_ibUc^ئjfD PP[SwY K-͕gSJ89XFZOk(*"ʬ[~򕤹T#Dp"I5]WޝaUxܬK]|W%=2]3]v2th,jOw¸~[}ͯ3By$=3IBWτtRnPk.ut7&!'F'X%MU sB|e;)!CdZt l22j~d'1΅bU} \hr0Z|tMx'ke@0o7zUPB_}ޫ oxPɤn#rX7Kd\O O7rKsVt='} k`9١:GaX<6)} <.-’qP>.|έIu}4jJ'ooIV ?c?4q*Ҫ iN ̀h4fmNUVH&e|@,Az}Ma2BWTSpFoNưUrdztGRXs#OXHI\0 N40g,"Ur{u`C B9Rny‰=Kwn, ;h4Ǝ"Tӈk54 fޒZI]w',fؽd$PP.mt嵐,T" Bx7WHJ*Ė "HbOX< 4Rd ydBH4FlN[n}@0UMgJ3H`I 1&^I)&u7o,Lfe^aHYot sSb1%6f}@*㥹GӁM;; Uvv|`TLa"*zl7fɂh)Be`mIӢݥb-c*ʈU;mx`f5X;= *ijGD`+~0'^S,$p†" q~beRhNLCovjvc%:nyjDxL6lW )[n*PCdSEYؚ\3F8/k"J}0Uoގb$\ev=)b\(XK>e@1#Q$:7Ц;17Gg}n`8-۳AvZ'س4tʥ~ ye,n0l[t4^So&>WRozӛy+^kF+W/@mcl{f,/ULs@o ~cF+R[n}wA @6aܛD% (ΠtcG;/Of=@bL(y5rX›%'Wۂ_/_"#q#\b|"c+a$[k#y1K\<Շ1 !iz40sTjL`Lpkx8 ϾXPBc9g9_jK+x6+oɒ|9ͬaC(1w!*YC…@DK`=X?GʧIf\pK iU]~Lbfm}z1G<")lEXjo7s~;N| zXc=s_g_Ev }^Rˎ,&+t,!ɝGW6@U$sܔ:ƞ O$ b vC<:WsoHiHŶhxIyG0#8楁ZQ+p&՗@7rL-;\ bIJPDBZ|Q) S7P)L:78&IE=)c!)@М5bIe,,zW!i 1Ǐ74Bʇh)(HWnR t=@*iBm<̨# ~4w暘$CJI^vSpmîLvfJ&3/Rvl&%]8$Ûs)|#J_7FC 
$ޅ*=ZF^JXΈZ28s8q@MjqܐC%Jd'Anհ':?bߦjJW8A}So1.os}""%HcϨjEkƬs3x"9GiqS?JOhlC jؓAigZ+{8~KunX7_`Dܝ'T2Tǣ]|{.X}*haR**vXB yx:NgMHUB.3jg?zqˀjK_Mj9i0 RshsW΂8`¾2=QuxЉ#v?~zXCE_"֢O4 {Xjop'giI謇9Dza __ꨧ"nZBJo-9i >tՓ霡 /uygZ\C <>S:b+հ9<3+AuvᛮI qP-ηQl xGt e-^=+Z dN0PRAi7 >sZ}j0 w1}zy+7˯ޅGaoW~Q)E~HbT9L7pa>/QJqr;%znpj@pF+`z H=)D\nғN'Խa{,nUWvŜ[k{aiErmrdY;O @?O]VF 9?V ]9(Z.@`2Bw9?_",]7?n.݀~NtN- SʗKrdM2 {\WQݾ:"$Uý}>w-14N}jbg43t;+&n-֬N})T23ZZ[ExHx!K߾>ZMٙj6ŷ\ 7N; [-d!٬+loC=Bt4-JGp oPh[~ }ayE-mvͲ\1v  {( *+ֵOlw$ovrι`+ E7ob FۇXkFVottsA15 D_x\׆]srskfN!X { e qEƅBFƢ/V1"Rhux!%Y30arN70Rk- yBZUZ_/UdVmt|*Sr#e Iy/zNz@I+(z4ª˘EK8p=7kkO#6' '+} I6H4f)[@8ez'??)_JEr&MLGC(hx&ś7)޼IM5\'q84* C<< "'-ⒸΙH{f* :#秒oȿ̝2hXDozv{< XΆ4go)n>; 'ߐ)z@䔴{o=cS JBBhLސ#h5D`y s RVH 2'%;Ap0o xwi8:D0T3M NGUz!t'~~Y3lIQ50Aegl[ 3],$Xh] ˍ_$kěěěě*Xhf ``24B Ldx"ShY!=,P`5t 0l$L[6n5D0a)%}>@ 55Fp,T093.U-̠(`T61S1 k_%ӖHMUZD`p)UtLOX3k_PGP޵R54QE$Z{/BuP5Hz!vAJ9*'9nqVrGmVRAu#JBT(Oa)JUЗHFDG)RJw^zJP Xc ((9wߒ&hA^}հ*ݍk ݚN>U>_DACCM!\d*8XD Ƙ!Hiɺ.C~ݔ ZyB YtTIwlbӋceByJ0k׽kC\?_ AGHSͮ2PL@M!!WF$1Ό/PaF(JNu8hjξ_Q>PLqÙQq9`RA8^ k8d'cNζ!%Ү<|=cJRV3>@gA^1Cby_:3=at ! "_C@K$ .GbTEHFڪMBAԖ%Th9K#Tr]Iٓ&ojchy%%HսfSqlء~Z"; %sBNH CESj|Ե21ѣdI4ZV=# z&0O4F?C BG+cq-$##֢IF-4#^ 7nӍ̼BLñdhtdCTV+-QV.2U8` pj2M,@j@m1z 8" @iB!ㅖQh&}QÉf xNYSއ.7TNaFZGAl@l0.Mia אB fLXc5F!*#9Xof5'j=),d."R1/b&OQD4`ċmXj=rjѨ8 Q*EtM NQx`ӆHA Qea M 42]&x)fJ fP)E[4AjpCeM l(h!$ y f\S#[+^M1ĈUj;`V!E+ }If0tБVĊQD+bM/keL ( %Ej1uD@n+J}i,q1y2:/!wW>}v4#Z`PV^ŤB!rOY>WWk8 j^7L/odh{? sʜnVעV(uګh K+ޯwAd<V=ujWMCzrg⋏}1Fk`k;ӰGwC)a齙CQj܎<ﯷNw^/'qcD:9O!Y)ΙR|8DTD?Lg (vPZK},yZ䀦? ^OdNT:h-h:.")I - 7?)`7& pL$V64Jf0WۋuPqGM5r DƊT@m ޭ[gupDA%J.(Λ|!6c~,)FFeMt(^j6톑+4F1Kf%%1(=t{8p:V+m]cG70 }Sb@߾{\;@5|2Qkx"`JЫڍzf^ϳ1:Qe_SڎyE>h33Gd%șeNgPa? 
IׯR/UE^<;ɼLrX9os̺ZKXO9 *bh5J5(QFqI0z4H۪@f_ 4]AT3n,%MJ^Tγ=NYݠ&>>VG0ݎ/_QԤ&T<6 O9S?Hpe~i/o6CZ5EMX& qZ~$ j$p??RvB;BJ?sX2cT7\z&Y+-I7./)|a;܁v%tk?^߻"8Vd-C҆}76χxrU.{^\N.GCw2a{\.̣ ^hE:"UR5tr+2un_*;(jY*䀠G >XdMBR1AqtAPO3wv7uS=~n*%1YD3>Y;Li$|Mon(B=zT?#cǻ[^D?DǓ/UwpeŔrѯg5>Q}ӝ`؅X؇ѬrNދ_Y%!{  Pm)]`pepIٗ>~#~, |i;X>!#ɟ9B!ٺ?=L {83~5vxg[AϬHPVx\]=~ޭ䣔b3?edGlyywi AG= kinVEY 9"{ʅDvIq|}*b(Q<84Y^X/ΠF?>{9.Ec~<aJUQ@j{t`Cnl፛A 5N5R&΅^HhM4Zc `i5Jovߔ S܌֧wyj0 cQ'Q r^x&9s>'7qHU|VhV`4ֱVuP`7Xm- 4lg9K-&)*_@מ\[f0:c4 p t0h?ND*/]E|F!3sq2/=g^*ᣌ 3 8ݎj}i@jjTyG-XS,V/hպ4:%ȱ!%HPNɲL]6mNu0wF޿J$lWz}\_ۺY}#նNe]v2p~gKt`p&v`$_ ; ]Wɐ\dN%6;gETS<7n6nq! H)v /&htEZϛnDϛDLgvl!}m[Oa Տ̴,P, ]wgĿ纟 : f3I{pT5|amڐ:j[|H׃Ѷyb|B6-OcZ_ ڙz޹Mt+aƪiӾf7ֹ (496^Fh4Q((yܝmcl׼o)u\ᥢV}>}OUu;N`x>ܻLy}F|:St̒qLg:=7dcPZtr G.l9ki6z%Y&gQLS@v6vs |7 R2/&cIR\{/" 햾qߟڣb-@랍-už *ES}--/h IGt[lSш(Id1F,$SrVc,KLf[ۋۖ놫O' }ۋ(pmuClzջ0]^C6ӫ P5iJ2Ut*sј"XP6++i6Ik]$2%:@ԞQ$Hu;fmU(-Y-]ZcTUqEzjMj* M,V@(1zݨ5CXY j1FU7m6Ǥ[f g ɕxyl붷IL چl0uOmv[EOk5R|h@BىDo]Y«@Zcx`=Xh}}^|0@9dJZqZpn;qv͋SYAs>o4<:3OzĖ(hu% z%l1A_.&Cr5{&l*&䙵wHdvX:딎zI-蜌c2EHh [eC-yA$]fG#,Rms|0^k jbq,(]ObOQ(oPJ)N~ 0Pð^~մ תg8|cR@lM ,\CYl%r6n(,Pk(G*:صO<|5nyfYbq٪2zSv,r8)H0}gTO@U=5C3S #`UxF=zĚ HS C1'.M,NIJb \ H cΘ}|onM/v v$j:]`]-Z1| KlL{[?m%x#QŖʤB͎w<X͞hpD\D>?msf[\`n:e-BF <@Kr!|jt8.яZΚv.tְ柕Z9Ϛ@% ͱ湰N\ Qxd!e5ibwk]+bн/tQdHm5UV,P9^eKZZ&DED'#F}Ѡ{C_o4h@h\{JE8A,=&K2ՔMq(h5IơBѮ}#8?0TA]y 2 AӁ"A"&`Suы3g+!0849;EI|#dĿ7ֱ\I`1Ƃi{NָfzL@,ZU,s8]*flFJ$KREM83 Xag99/iAmZ0)q9gvձKu6s3M1I,l&Ql|1[I]?z)z2XPst7!ao^ rdT$K2Cr6 B=ԔKcT^u\1Jkҏ` ӔY 0eYS9v L4]M^BWʂNٔZCF ^жίj%d+8 aWז,=:"e#6+EYB Z5o 7Vlίk*n;ǵmW-v3IgfW,}$\4+u{{v|ʆsشa$^F_r_uVr|?c̷͝On?[5zu?aEAvuqw<\~_Z|$߄gCla:$_4j9>WPC0`zN !tG .k#Fi~ri;!{ 0|DZWwܔ@9=?Þ#wT@+{A x'uc8a(׼yjbI+[3W)>&C2M 灛k$HF<}N^/jۓj/PJ` X6ḞĽ[Z/6R?l6iM4'Óg;ܟƤX_VŘ8JP5j,ق 8fxˣPUWC-Z*uj;;@qU1ǵh(y }v?mjp2>A9@]͛Khp7-MUu;*/4RcJ]h^ cU*_"ay^u8R(:wlOh]F(3cYnF _A|tev 5ـs̠ =l`[F떀:w wU-ǻ^|kdG|"465gQ?Ofω~`As>geB*9+)7bq[=@s\!JLv +% %'uв%'\sT@&ZFx=Y~\MdsPLXPˊNd4U:fu[,FVvz\L'KuB`qFyN8c;ω*QK:=|NP4'beYp/i% ١ ]yc B/BќH4^\ęթXitRk$@9q.j&q2 , N̂Y PjRs'-FN 
đrw8'{$IC.RtWHYgRs"W* !IE%T+HFb,/Hٲ ,%􌮸 dQRHX [lrPvcFga%x(@g}LúOf8sm"cpK㴳";hkxXE,agK%gqжI4bH7։%! ׁnyQ$R*%$?y`؉_~{}N(T hpQTN3TiVsH8+MK-qU$D8!SЋFPGڳbL[ 8A鯚T68 lt'},v@C~7˹ZݫϯxN.F8JL!na 0bADXTvNWOW?q0oNW~<] R \uv:W؟~xv8ŵ$>*g|q(?!"phB(3H]^D˜oQj#сqQ(ٴ܇'6ɽR}S`aHٜĿ}uV׵v W}nG!uX9JWV*pNI]JkQU]ֈ(i9JUi !%t:T׷WHLi (vƛHĜt0b`xGhb*gi91ΚgzjuXɶ;.jBR峾*spK˦vgc-qNL(,pBI*plS"E*iYkJ3@NYP%RG5$ 9Μp,6AMl3 jy1.!\ׄs`VXSpqOxp+eLE51R&P Fs(f m y[SwAkyOrApNC'y;).4 .H%'>yWHfYE1hB8p㛅'i_QDmc2B vu iT8EО3&v+ LqE@r*TP8I$j+u,NV7N n[uFIVYߞXeLEF74Fv[ ߭oHG+>7lS߼bgh&6b|%gh㔩p mST[0iMHΈR/&cCi Bgk\18i[J2[b2%ab.E}lp+0E̟$s1X0,)(ci2ͤ?CX)Ƴ$PW:yn/L<{5 T{ܾ\_ow&Kb|lOg/Qn'kg?ɤz׵Y^LM>v8ѕyנ7EgAdp B\"JF$j}{W].R-;VjV $-Hewt%Y#5|10D:(1d|G[ȑ^Dss[Z [P@nh4W[7q=*>P}WO9Jndϊᆢ`%z҆|yo9_u1iTwGzJ^=FzuO-]t,魼y/a>Ѥ?Ry-00.w)J[O RDX|JY.xs&]pW^[Z`"r#nߎ]RlXt" ~=np]dt;o3[rU~:\Va-JbXFpsa团/aa(-$Ys.1SN m#4O%"f+SIJ-9F9@$`؍Ca-֩]|X =ڜ8?LjbiJo(n6}"749G0T7m0^/ ׂ}nvaE1a&㤈i6M_2Mq?_n/W巯 룪uQP'#7t4 x擎 SoýezT̵s'YV%*n=٢%{7+אUpR+2sO{{"[P_ͮ~"RR>5 8D شK}ڙR6S#ʼn籑.Hp+tD<6[v˘w*먠bH}Ӯ+-+C~ϭSNyaEa/uYiMU?'Uv\מAɇ.?.WdO C?OlUӝy|7>IVw_wmu0.3懘ɠ_C޵%\tn2>e M.*]@0C0Kr>|9[U㭃4JfZj)q2i-ʵWuY`QKinj4U"Wi6fpiP4 \BV z`?݁^ \ф[AU3eĐ] 3f͟Fk]8NÀ JZ}(j"ÉozJ=aõuʹZO=EB-biͻnֱS_N Kď&_RRlR(_l(akdwԘnөzM(9͇A<;o9ւn2ӬGɤT}p 2F$hdKS*I׹TT[\ )INrsU.AB?8p.1m5XEY c\RO((SC#9U 0D⥌d,7ߋrS_lq"}QE\W 3'G7}̊ n8?nΊ_O'n|?%vp_~Spj4M_).['g̫?=hdL iTH{i^-< 1#!qYR41YL4Q9H."T> -sZn Q5Vu/ ]tqx36*_214Ƥ$i1jdIV':mB<' NĮjBl?F~Jw^%A{VKWutP]`~.eLmTP#i Bҭj hqh2RMȘsKHeyPc k JF;ZC'GCr-:\>hr,~e Btv&OjyE&"Mq XCLl*7x}jjH)4 f,V| }ul$Ƴ/QڟIiЋhExTÂ-,V-t$.N LKRcQӶ/H2ĭT6|ewǓm^x9'%R<z2k l_tJ)L3BTxœ W)&r2Ő7I"*3C3I 9y $60=u(>js:|X>oaGTLcREISvHR1ɓT$Nqq@9L"k.YA瓫n3ZS g{j?'l'Lm6P>s4\UxU/Y~hAK@-Nύ簟a3]<`~*"4Z0[p=cO]hõDb~ XjȭX"YGk1K=qB3nVPF2F FQ&qhe*:1pnϽ 0x6/Bj"ZlEaԤK5]FLN@l NTkV7l݁Ћu%V٣ %Ur4vYS5^+@-zrEVΔy(̱爂ޕ#bBzpMNO]h(WE}|a;xtyo=SSGp*0Gk3eSDZ\@8*FjM]W ߟ_B2[7;ZL.JC0.ӱM\+fړOLu=H<%ah.O`.W߸ٸw}M4*uפ};̒qg<%:ltrǩOPFY/7ulJ(tPVXh:ZkH$GB:T'!3X u=8;mfM09\v)&͉ݚ_-v"||@J,1`Dj"evN;女У+V>.wRPyM,NhS\tqU<+f>{tY- }$7E §q0Ntoa~ Ts"o)_nYs'3=Ff;&(?ڵ 
J)I"tNybcih.M+[k))SCu'3\"$Xʼn?0ΌqX@tZ=w_T Tu*ؑl.kVv '-_̿ȬQᮬ@>%Ini/lC L ej8G@I.\b \Ʃʄ0\h(ؔY!5IˆٌroT2KQYJOMD .3omFOHY逖ưpfAwy6a$~oXa ,˻G1 hBўX1ChZ㑕=Rv˘r |h}+{F|ҸO]D֭z% ,5so5h>a?N7V0r?~%ǾC +$Gqy4?OE4_"z\JzCd+X2|N~M~k衍# 9leGԷUSLD'4"fMLBӐ)hsi/`=3^"F@"^ z%y?Ci( SJ^rtLJ_?6dvesԞ+<O4Sar>Y($Ib4rYf#;YQ:5lƓL'a4CSd,fV9gV( F0B:}~bac;^Z'˞*'slev9o헭y+ǵQ-q? `6/o%3XI7\vAJ͈G[;f&TJJ_VNrM@iFH¢F4(f 4g[qC> ek,y U#3JJECZ06!N 0=¦J9oFS>bgQ|S.ꢽbV85rP^@f" gxÙMӟ%#}\-IGcPut(晅[qf`:TJ5\.4&ص+N)y: i _vsM9 l.),72o%i]R pͻ5"\\Khكi<+%ʝg@K$D֫|iKl7zmc)_;NI#^$*(=d qẲ0\]Zp6  .?"+[VuE.%\c B5ӋcƺI'K2% pXr}+Jbu17vL.*h񠌧$GΤ*oB&BӀ3#F2𧱺@Iwv@I>jp Dp**@5{1F-eH b= r?xkV;gO4W߳D\޲zp؛(^LEt OT4. :ĞoV{ͦ[ 1gi\YXd "ɱ5(Wйь"{- qRKH- AKM<=7%I4iI[|usO=]=?k#F<^\.eAL\d#R7>iL}>3WwY߆xEXr jĻ8\iB&ej2 ``Ric$4#6!c4RXE7LϘZ?P|'Vx S85I0SmKkT>Vh>ɷDT(r3Y φ' [X#C V^5<(W >u8xD}(Y*0wu b.ٛZ`d]:lUNDGwvubI)so}Gi.!UWZ3|b]yE*8&R!xO1TD18R 4ې.&:Sh0ph l+I7HkG܇giq9GoР6͈lˎ٫|wb"-fizv>$YEYOdC?Ɇ~66q-I-PsA@K qltQ ]5L)I&I6*tM8n,7#30F9ßpWo%Im^V]/,@Ë%pW Yp}HZ%6RVSd;(,`i].ՔD-I= A0Lķ*BQɅ]o6Z3a371i"Ħ4P1[ j+iAxӠO,'eq4g>iIT\;3g$ՠP9רQqb 2 N'!#a1D:YY$s~E .#$tY )G/<="\%mfK&ҹr#CR2ht)C"ד!n^"HGie:is5&_)E\rی*yn]5&RRy-Eq4b\kd`Z@-}J8<.t]^Cz#DF%KG_1[wJ'i ų d(},Q6ɏ׽^7|ɝGalͫLrB[l.& [eL{pKa(-:ldԜ4N\bRyi7['/9)&4/Xzyq2F]^[$Vnx44kv78c NOP^Hdz YTi؆y DHKc68S׿}9(9-I:=o р|C3G:ȟkȗ\ӄҚ&GE*C\ЮH̖2Lcȕsc: E|/=Z0Όe6mw B k-X@5PkZ :, M%of:M=ŷ%%-CuQJ2I|&9k4ɭtYHO+-2ٖwin2R2kXT~8mѣ(W JgWJQ#L;6u@}0KQM[դTǃpNG 0~^AbLo(%^>UtQzɒHIZ\Pv?OI>?OlYXN1HQ *+'}R3.) t޹ ="@R#5-WΖ+_ޱBnD s$S$:DoF|c 4 +5 +.y Ϯnο^]>Ȓ..giqg/yV݅ g?kyv]!N?HclQ ~.EVg%Ge#SMjVOf{9Dȑve."M/5jڰ\uDEѿ{Ւo#Z]s0LOBT² >|* QeL {d~DHP(9!ΞE1ӓڛ1y?JditsPF|<5);jpp&=0rR~ Ҁ( Ӑ>z'9z'k;1i8L@4!V[y 4QlSEq- !*@8EU9W/ٜ`Ƙo.Z`Q P.|OB J>[ɛG?>m'Z ᄑʳyOE釟fGtk%Lҋc}myj%E+w5Sׅ8BnuˬȺ{bw -4w_6sHIGs3DŽZmUW2&3ǝL!" 
ԁ{dE]mhxi=(dյzoga"FE L9ӀU_kb{y_Ǻu\ud1eaY7Yy~}pn暌0{HܝW~}1DbƬg/3AZl.FoZl"gV <$á{H=8tZO[cnky] `3<~~H$*O5wq*8Q22ta{Qac+Be%Cd4 >pb6W=* ZhV\pǥ=-W(^Z1iX=h^K:VIR'yZ" ^V Ya{,Oإ9:yzj'[GW׿0hgo VyV rhhVUz]"6ޔz?wJ_ן:3>U =ucO7eӥx)<\Տ8;dJzoY'>T-lvmBwMki+~w-Q|!?S 4_m JV[wlu^g'Ѩ)w,kw׌sZ(hjlLVHdCVrbuWA4[-Y`+f6C825o}ŋӭw[Fr?+n*޽|!.&[ٗ볾UCI ~U'Y_||no|Xݽyusu˴S_n}\[}&VWwoL :R߽MM6cl_7ԩH.> ?\w4d'^-q`;a!ힻ2(jވS tz}2q%)]{w6JG$U(cCt@c'gHdG]QNcå`pkpKVX8 Vr~y#@[8bIy[o5o7ٗR0i$+ CT_nwBRH%;rw]}fC 5XY6d;;\t`;ӵ6]D GvfAZSLyw\[ ӵ GaqǃCbDz8ĚAzH>D4>Bl?3L`kQ QbR֧D_pl18Ԟ(L 8H:ȹ7zFS]s'Ptg/}}w{,o>N=B]҈u&.E;[Z178b~vwЏzޒ>TѓX->/5}~ԣ3Gb( *GcZYg,_>P :aBVT-PD/lp1p^df9fYRz9?7fYy<ىĢP$71p|ZAŲ/7e&|R$|PɞN WkHPvh?`J%AIgcUcZTհ-^XHS,*  i߸O.SɡOY ɳ9OI:x"ɓE-Z;iQg*:;YɐѩHϾu>)ՒHCh-43BOko:YmE9{~grVSI:ˠȃA#,ZkwB6qZQ݄hp_/EV:clkX*4B[3fG RS=먍 6fm\`B9] :DoPCǟ*pc?.?0Ȓs7΂<0Jk8Bc8ͫVjR pZIUUcIk]PڠcJb ԩ~Wvn7qqœthuMdKPJm%Wc@ (86FT] ?]ZABІ*" nfO5B!ڄexDNCH[WKVBP[ 8>Or[ړL$S3 i 3]cȝIj&\~!O'Lx^FXҚQF,W$m;W:c[d$U08O} Q/SzB^C06jHz ;*fd-bxI&Pk*al*Vm"OX4 2"+ZZwNJ(C:z^j9U>B%PE%9VU?k%X6޳avtj U#j U+/۪SVx!w֢}A9LJ־Ǧk{n HIXY_-d rs9FL9ԍ\YhMRAkú&W I8bwUMe⬖±_m/%ZAĠY~G[_#+)?\5Z'?AX񪆼\roӇ Lwx.ƿ-PRy|TY`KFv^muf4lGϺ韼S(&Nui'|SrTB}S=E5v#ZIhOPQӨD94s76c}z|s!h}|n/{|ExƧ#h|z)oYީFdYwۇKB=]4vJ&.KEttky,Z\ɘZN!\tOΞlpgO-- Nm ;%#WI~f^|u%~~[ǿUwoG`]1 #hʺ+g}ֳ1!)gSQQqsG˹uT5?h.Gc(`& g8tpOZS}eHX~ {Yw뤘o??a;b1׊3N*qNЏQ9xUK?GMs Ad_,ˣA][F.>J(}G3~eZeB*=MV)8\vD~p!f%kW>g1-{rH{~w%vqimZ9L.Ak\"XQof1Lt<__i<+NO,A䢘ٖ3Gb$ 7 pf+3h>/~/nM}U $i`TO,I0L~X2j/Ĝ", >VcH.ī̱@"+tB%RZ(JO(ӕ&c&"r.K({XF7D.|ȿFΒN78zr)g6b$< 臕9ʔ$GK_Mǐ7 <\$@jg80gk(n3 h3yB鑇H7`AqS^߇[#2ŒʑrG!pSD ړvC LC8zFl˳k?w7Yw|!+7Bx)n =UMO0Z'2]a2ǭ6?HFٍ[Ẃ cjm9͙Vj!21%;bh! 
Rh@ =4 Mz] 2 :l4HV5ur.@*&t']'r]ȟtQNNФ@S͒!S&]oDqNIhĎS 0K%QEUч헷5&t]Ys7+ =nDpd zv؇cc-Ґ-[<}dH"Q/LucW8#zWM1~ؤs6hnG{:1sݷHw}裃1m΂up {[Yz4F;zAJieA͛o'+e\&FݯaH"M}w?6 BLWoSa M%7H#QSm;*n ҭ{?}083J9IȎWwp;4X-oFaFf^K Yc<.bެc/gmtHg>cc G3mQyUV_Ǩ4Z02EZߖ6Z(~>1īp}޼ggMO;ӻ_!/W4÷iRTN:5s dI?q6HLh|:9u N9 t]Q1HHʕ ]Qht=r1&zg'kH&iUGdA[ ]#6Ibۯ M[>1 J 1 TqWSyۿLo&Y(nr=)[IoAmf9B:ACZn.'& tJX 51(l&cB;CE6<OʞV|P;zSu"a2k z'vFSsQ4A(${@;qC0G/G1/C8_W@T^h[m=߸(ASl rګ$80c [wBnғLPнϬBrbbR31Sf2'' gX:R^: [,2migׅq3#gu7:%iu~Y/i 9(T׏T('}*VJZ$61iQ+8EeP4hXPM,ZH)Foka{ӎ,y"AVDk5 ?v- cŶaĹ){l[ڭ?&0uP$aSsN齃^OCWaowW=fBi|(EXx۝$1IT\yPT2W`_a~H-3dǻ'vë́c.>9{qqպأXz;!(oV$%Ǭr}s|esM fg֒$Mט/̓M5K4Hٜ v"h(1ݞZ/>}Vg߻w7?1}>{W{~i*|gygpw== mզW^:)7В׊i?Esbn`k41*2r%mo~7g馾8kQ3\}\ՑdbyqUl!li6 ) b(Q#-Pϐ?!cuޝm zG ѹ!xdS6v@ji nK|:l>qm?#-1|3zۏwķc[Z%ș|$5N{nH2`9SRqxj?.ڌ/u*WU4Iz' =( 0A4_=xg̀ #CZ 3Zth( Lz$1b("M+ XY/cEZ1Is8ܺK/_kq/:i|h w` ~PEl-' 2M?w~fV}!!1q H06eՈJ6?Aر8q4 [>GbN.ZyĜ:zxmo&v/i{>?tC˼ `M%7ZYXFcU+t6e ժdK艦-;`V3eУbCC?+Ya]Wfiص80쳻օؽ|rOz6rwa J?'ۼC+󼵏CX}. ޷jU1$Mu&K @h_ev.W˧mvCF W1gxF+TuoX=>fǨIPd_|jrVr50[VQ(@+~Gh7NQd/ D+bz{z& 9Ѕ(;@O΅G~٘q!7FxJ5s4lCR(]VgBٍVkK JI'R)ݽVǩLv u$4L)-qTglF:K}s^,3dqvRk͹.霹r.g04 #;t>0Fy5?\ƀ54' 7+2oy+)_ݶRKµfwED,<=c m S}]1V 2~\%{߱>Q9zVV+/:7P3pK/JZX!O6/߱oG3eK2tl}Cxm_K~i{[7 Jye g"=#GSDZKBPzf(,PF!/NsGB:ҼF9'@`-=O;^z&! 
TNP*6Ҙɔ2RەRJ'hO6K?)C68 E.8tfhRYwf1f c Y,NǦmMق \j-e=;֔F }$OHW|';CYvz@鑓rK"r"1s;%OG.-eXb v]nݱfX3tؙdVeY2M$.GRPWxf%:ق PKEHP!Ju u`U ӥ+Ed@nN0=sF𤢕I s2W0""ę%E8JL_* ‰lx p7g)I}Oc' Vt'dcH7B5Kk/z\2fʂ_"$-'k@bM1*/͠|MHs7[6K\Ӂ+Ӡ$f]rzjwH}"D + 4T0Tu7FQ|IВ8c̺lOQƘ˾h&t9"7{+3g:b:>si7^ǒfcW_.GQ@u_ gэ'95q3lT6hNgw FFo S 0'q`g?ٻFn$WJ×b aqe%eGJՖ[nNj5YObμ )\owؠ/qcρQՃ6w|~"0|DT[,AKxuCa E}WϫGꢲ9q|1i60Kt3[cZ}[;3 ;e4;5;N {baJJ='u |1$w_,7# _8bGߖ(䗑#9o%à>"PǩWL/ \v?_&PdB\UJ16Zro;-ɽ\/(E:%HA{t%JTɽlP*߾ȁ@AELܣ%Qh-oN \ Qn ΐwp {f"A00BِSr]'0v"6+04/2i)^6+lZ{ N|H~6oAV4w$uAHfϳ t SW'\ EL JC e{?Ku₍x|P1(`R_[:A h~^/$׍{uXNȥ6E?jJ>-\۴p( }xW>]e͇R\M1N$:Or;wuyyY7u^ٚ1˸w2Z3΅`, A!H|B&QfkU5[h "ԗ { 9Rima'r%`P&+4* 'HkFk,NZ!ڨQl`DM iZ%&  WĜ5(bB0N2ia:%/ZTRƽ4yM<Nz;,NrzRA[gp)-fSxʞ9B18.Er;K]6`N.R@!-A7Vn1u_-fϸwEGa]i:؏~{c?}xjl|e~.F.QFX?UAz; _!պ1 9p/nnӷ^0(z͓FX`/U^6^:]\)C1\om|)ê8 ht{Sͪ-$7W|ѳEY:@T?Ma~V Og7ׅY 4 g cſ{;qm3cܺgn!P=4p[9ѹq~9ʰyo q5c[Q/8 G6~kwewΫO`޸F)7#lb|F=zI8RaaUtO \tƥYpߚŇ*ФC"d=CozjFٞ S $$=((PN< |=sF[-;NŘР9"/7 $=t:y:!uFFohJ(!\Rс:Z@<7cLb鄝 OS}FNhOs3Sn7d[UG @r>Kja\tXYfwB"Je4qNӀQ(a1TYɄ !QN5\?]x,$X |Hd?ǫûY$} s{cJ>idcĄ7miE|wBZ?}Lpjtؤ$ n~k ß {O$F>? &erbW6c@49l)tn1>7WϮlv}gH{|\U"sL\OfJjK)-LW&I [_T>ؙ<@;K~^zw,[ gĶd=Iڃ-lcr˷En[?\Gg+^Mr/zyL>JF$Oe9T;n>./w{݋"wW.ŗJSN&2|g\ &%"u$J 4xNPF@ LxfK1ElHt7ARg><,бe5 J|8VJNyk00!g=#22ӎs&]ܣ^2(!;PB#0^M̢qoӇ=}\:NQI)A II2sEȃSNpNF|{B\E.{y?ouI:b|orӏa²J]_,"l:2gosԋck;n!81 N% 1CAMxqm,p$ K3!t%LF9)2isI"CǤgRPIX~+v4ȒѧRHֱE@Bjs冀cn5SEMidD JBFx )>Bj?LVq96j!^)HYzf"3IhkSp*BH6LФ!ܲqϽsL͢eԀ*⅐\H6`z_WNʎ(ᠡ,6T[VAԞy\A_Ur`H'P#G]dp"F. 
aO0ga2Օ64ݺDĪ|zB5U_*XN'wr9qw^QMZJ*qrPAҚzK/(?+{SY X+nwB@]z~ |UKmW7CRޥ#mx7k+>]Pq@s 8mfz{H:T`=Hc$<0A|*2^r @lT`qӣ[TT7"*ۀ$T*RK 71x!Ma%Q (hrO7�Gr#YKi#۬UiHMR?A9Nh06P.8 <Nb⣏iZ"5h6fzJtIc#]fˢki~`ݷP1& 9vE`eFu(/4Jb@4gtFy讎^S "jDxS1v782$Ěsmٖ Yg=iu3WMWnݪ_[%m߮^֥P|U'r~vmxf;A/߬s]XvM%7*.~]LU9pecU#eS+a㽇4.$̭xU4u%=zFt֞$䙋hLTQRڭѩv;a'"?v&T!!\Dkd )| q[)EN*1(1:iF`jpVKhvABVɔTicgNXzV5ʰ?V(h~1W3홣UlpEBY5:p@"hPGi/+f5GzI` "-06pPqWZ+Պ0gbh>ΔuVbqX+sIF= sapF?Nȱ)CZ${p ݠ̺y{Z J6ctÞsx,)TT- >^cGi QM[7P7LcѝaN':̝np pB8=>28s>J/;O}gkN4[}[ N^8y-W`/C- p !Z"@"QYEQ`Ydjm@w]zg.52ORj7 WAԎEOI&T!!\D+dJE'h7ltS6agq_Khq$䙋LB_fs&bYLĒ7d"RZF̵1[nK? C6zEXr@k Dxaݳbmjq>h^(\tJ.r%)czqO^ &, CӺ9%ēE#aōU,TXAX^FxMu,KR@W$O9tVP.j1_ [U6DIiVakHh*Mjk|jѐMN FWMkV &$$CJ,jNyZx鋊DPٻ7$W~G%@KX ,3/k4emY^H],VlŬˌ̈0g(ZF%Il(XF=Sy/e,䴳@,5V vzNYuQFʫg{cAS?G0@%Z-!WFP+pcND!qNbbQAM,'qЎCN]!(D]\c\f,..4&jK8JrzT 5av|vcN@TPA518͢v̄C.$RV~\Ѽv,pla|h6 MaP9kR@@U hD<)\Iw$5ұ%r|N="Z &%e!@lBA $9A'5!WƁQk*P=ą\]!Rp=u1TWP<^4<ؤ%Vk{Npm@С4b#րNrRZi9r4%.l!gx9jH=lP.EāQ`\kύÏL.#ssVP*P@ 8B0r/S{AxHT#n'Gp **>;z:Ş.߻Yk <-`ΠVIwۧ_m1>58~->B[NVU A? Ɠ還kYlnst?P њL3JsjHƱNZs~a-l~,֦2u Wܹ"3J0ߗ9vCq]9`ɟY. xlz߾\=,dm|s8`NϿ)z 3τv?<^=/8n`?Y:hgGƌY;ɷmOw|e" {Ç8dvy$)LwIIYΠyYSNkH]1ߏ 5h+̓DOHВ):Dc(N)ѼKJʲK; m!f[Eur(Y9GKK1q R?)%p SJ$=ʱGo@:Z?8EQq1KF߆ MӢaT7ޅQ}.D0gK.`+/̄>kJ%o9Y%I-WkieH-] tB6ܓn_Dp|MZۧ?D?F6w9l$ի}>Sc~i*c(˳r mIVȗ!w<1q 1"Q6a8=J+ut-KG>4^xu~y8.qյet5d(kuq wV@]#YOl^Ӛ.at!G RvLe6IvWhHeX+k ggV.R2y2rh}rWr,ߥ/E 9bT0TELkO!Mq}:>>tnǣ2ɏԪhxtĭZPBTy:}Fs>$M5Z-> A~vӻ_[*.Զt )o rb!_f[hPŁQ)O qVIɕ$= ?8ݩE6۲"jD5b)Wj+Wֺ_I_ؙ5ĘCEN~}(ߛ-bf;Ύ7avQ2>7Yΐ Î鍎ϊM9,Z\k`(΢-\.@Е2V[ޱe˚U֯ջs}<o>ߙ0AQ r*bj;Vx\1 ^RD[g^Uf鐳tcWDz6Hy.Vt9 'V;@Ŵ>m(hu(NL!%F,˶K1}rD\TCyHgZ(b$p2\C~lB,g.FtRKGW~UQ e` +t@5g1zA!:y 4:nK)}C'TJ hH2MPY/mvnN'9QBhy=.ЛI2)ҰQN衠r%UC"d [M'ш-͡7cQpPRQh(bd̰ԦT "DtQkU3dk%(_,3 c^/ ffacB4 ~ژyc:oL^7&-{_Iͬ!\YI$X \ $ox 9O2n)F&|x|? Kum_za q͞>q6\0s0k/<~B=! GCߠBpA*-:Z啽 deo;8V&-UhJ]bRr-KHPڐ'U\A'Rϖ$ YNj$e7:߫j^-nxvCFi@$q:]&/d;k-qQwB 4'TqWv% @M#ܛ;Eȥs*Z*KwT?'ss7r*T콰f7#$Ν. 
), 0$Fx'uZ譥k#a,,r.AL'j V_NjZ˯k7x{K07STF%Y.k7Wv<(F"PY~.o0;L͜"dvv#$Zժw@W]ҡ0I\׎Z6:Qxml`酤';`v2SS]!2/dt2B'+ -z-+4W3 >6< C2+NxRW9ƘM"^ml~=FU J#m,g;Η!YIsER!Ā}+F<(ʂ{P}ܼSBt(t2޷͉}爹]4|VOik4r\w50z|o;E@K>E{3"p$㝈0:6!U!u".@ (̱taO P'-|uY _׵p=q&b%Q8IhnMDy6Z\ ^#)Q"-tE2]}(fHs@jmm/ۗ,%1m[;o҂m>YxTIʤ(fD"0/r:*/D96U}F'a) sIn#P)i MPr MQTİҜT "L(7e.SZ m*ڛjf.әj0K-yO!@g2) (¡@5> /CeOR[/nץZH\Бَw7mq !,SPK3ЌGA-\Y_: z#A(Bvap4 {!ȷ鬳NV%FGKaKmuBϚU]KIGj:ݫ79k{+DnB(0 TyǏe,XH߀*!>  bD9'Tq F+hB|*gH&Z(Te"8௕{T36CS Cp]\O$0< 2* IT-*#HFǑ&.TS$= >̱˱{u&yFIy<30`W{a@A/kWWy 0K;adgW>~8uϽM_yt[ood<q}.2癙lomҼ8IC%yY-u=z DŽ̅%j2R 4fƌ#L`GQήԜ'}g90cf5\HG.S*2`eBN3Ar$Ypfʧh初(yfcõ$-jY%@mnpKo/ _kh,†.B]UlpOMU|ewP`Yz.6,NR1* d3^{?Jf޵S'KD`8YVK,*km%"|H̪0 |sVT$<ۨaX8#E׭)[kA5Kq5#[8LQYW1Mf->?ɽ,͡gVP]ocPeCCD!3HJ*[Lr$kA8V0YcV6%S*#$v_L_#)!%ig5X,+atia>K13*(Ljq"* 3;JӺVS}؜ YYWGĞs``F#'GX96,?D۹zD<{r9]PM=W"kTE|a=8 μHJzKi}g@'j dzuU2;48]DsL$O;4xDMƫb)ܾkPups=g'?uh'j^7B'mWyE ӓ^W*}laJ/HD's$^ې 81,Sñka, zmQ * ʝFݸ8Kb"9Z@_`ixpYu4J;jV91`V.w͝X9T1sM)jYxuuu]v;ͻV =wPpφ"u>Cvf1"jf/!3Ámbא֝NqR7Ω49dzC`H=v"EgB4uS6dB.t|I]e > >cm8^}^f$'A͉GCDwG=`|,Ojet.N 5ׂM=JTMND{Mz?ψ>jv&sŽ"#ò7c| sD*|-5!UT C%h+S@cPwsT2x-_]`Y1胷2hꨳnxuݫ)] ;::EWm#hSUna+d0$x!fB0aoc'H wa*hw.j϶_ubo؅l% Vvcr2waCj 7m7}پ_Yq8g]r;SZu>[/e6qxlqڢ5۝bcESf80 :$F6R> Ddע[/h B dPm5`ww6uDT %]rB }Ae_F1AZM$۝P&q &SUsJAHkՙk;d$0|Mi.!>e2|-sIkxe>}Ç~خBs=ȹ"-*<Ӭg=)~ч/KWۚ< y>'28V!:c" 4У H4qa'BS?g#(RuCQ%{Q(Dca,ޏlQo»Y*}md=q 01O깵֪[zng0TMOt8< Z1F8@̏Y(FP(UOBnV+l^ؿ/nk!my&yh$6w&ҿf;o~A ]2PR+*&9S,i ^BAP cit Џ1Op`#01_)"j Sa0Jk1ڨŌF1:ukn@"bR0D!A)Vu#t1\~ dPPLQPb4d0 E,8jrf?4Q 5Ek 99&Xv) x&vy;7f~ݬϝ&Xg&Iڋ 7$SS$˜>}~Cn^?~>\Nje>uAD|y?̾Zovd YC!t|g͏l}~w-Bʑ_51Qjq~_?bOxg (hSdK+929# Xn%_vfd;_ףݗ[Z 02Ie*t_yN&W0h1f w DuZq2W(9I"BsGqgbD*-%SAJ#_c_UzRx Fz*(9ƒ|^7+w6ͧWr3ޒܸ\vEn(7O\Wb1"tK[f&ϛ|Gjxjл/6ĥ'4)ZMG7II7T)"DN(< K2o" ) ٘S08{`c jg/NnHZmVؓyޙ5T^?!t>NTuqNQPm UU Si _b~WDptX :w}]R;gSV$`e.BaU m: O)4m*"Ql" oiv'af\N©SM"TM6C#8; ,W[sNNU=Co-w6VdoJ~i”ͼCՔvwWO"OiϷp7Vf}|53}y;eZa_ޙwQOec`{}i+=lt $jVs$~ wФMrA+MV4nuIjEd_[O--:6WG9;qhgI1`.>ʙqK[FѮt`+ޫMTf)?ܶ6ζomcZ蔽LMٙőrs o%>WZ9mZmM|ӛʝKHC 
var/home/core/zuul-output/logs/kubelet.log
Feb 17 15:20:37 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 15:20:37 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17
15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 
crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 
15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:37 crc restorecon[4690]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc 
restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 15:20:37 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:37 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 
crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc 
restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 15:20:38 crc restorecon[4690]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 15:20:38 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 17 15:20:38 crc kubenswrapper[4806]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.927460    4806 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934455    4806 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934485    4806 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934489    4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934496    4806 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934503    4806 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934511    4806 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934527    4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934535    4806 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934539    4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934544    4806 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934548    4806 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934553    4806 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934557    4806 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934560    4806 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934564    4806 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934567    4806 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934571    4806 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934575    4806 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934579    4806 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934583    4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934586    4806 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934590    4806 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934605    4806 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934609    4806 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934613    4806 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934617    4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934620    4806 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934624    4806 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934629    4806 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934632    4806 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934636    4806 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934639    4806 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934642    4806 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934648    4806 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934652    4806 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934657    4806 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934662    4806 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934667    4806 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934673    4806 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934678    4806 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934684    4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934689    4806 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934696    4806 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934702    4806 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934707    4806 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934711    4806 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934715    4806 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934719    4806 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934723    4806 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934727    4806 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934731    4806 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934734    4806 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934738    4806 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934742    4806 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934746    4806 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934750    4806 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934756    4806 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934760    4806 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934771    4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934775    4806 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934779    4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934783    4806 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934786    4806 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934789    4806 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934793    4806 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934796    4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934800    4806 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934803    4806 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934806    4806 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934810    4806 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.934814    4806 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.936925    4806 flags.go:64] FLAG: --address="0.0.0.0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.936957    4806 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.936983    4806 flags.go:64] FLAG: --anonymous-auth="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.936996    4806 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937004    4806 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937012    4806 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937023    4806 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937031    4806 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937036    4806 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937041    4806 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937047    4806 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937053    4806 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937058    4806 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937063    4806 flags.go:64] FLAG: --cgroup-root=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937068    4806 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937073    4806 flags.go:64] FLAG: --client-ca-file=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937078    4806 flags.go:64] FLAG: --cloud-config=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937083    4806 flags.go:64] FLAG: --cloud-provider=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937088    4806 flags.go:64] FLAG: --cluster-dns="[]"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937098    4806 flags.go:64] FLAG: --cluster-domain=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937103    4806 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937109    4806 flags.go:64] FLAG: --config-dir=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937114    4806 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937127    4806 flags.go:64] FLAG: --container-log-max-files="5"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937137    4806 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937145    4806 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937152    4806 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937159    4806 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937166    4806 flags.go:64] FLAG: --contention-profiling="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937172    4806 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937178    4806 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937183    4806 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937189    4806 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937196    4806 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937202    4806 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937207    4806 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937213    4806 flags.go:64] FLAG: --enable-load-reader="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937219    4806 flags.go:64] FLAG: --enable-server="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937224    4806 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937235    4806 flags.go:64] FLAG: --event-burst="100"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937240    4806 flags.go:64] FLAG: --event-qps="50"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937246    4806 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937251    4806 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937256    4806 flags.go:64] FLAG: --eviction-hard=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937263    4806 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937269    4806 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937274    4806 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937287    4806 flags.go:64] FLAG: --eviction-soft=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937293    4806 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937298    4806 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937310    4806 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937316    4806 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937321    4806 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937327    4806 flags.go:64] FLAG: --fail-swap-on="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937332    4806 flags.go:64] FLAG: --feature-gates=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937339    4806 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937345    4806 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937351    4806 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937356    4806 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937370    4806 flags.go:64] FLAG: --healthz-port="10248"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937375    4806 flags.go:64] FLAG: --help="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937380    4806 flags.go:64] FLAG: --hostname-override=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937385    4806 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937391    4806 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937397    4806 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937423    4806 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937428    4806 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937432    4806 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937437    4806 flags.go:64] FLAG: --image-service-endpoint=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937441    4806 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937445    4806 flags.go:64] FLAG: --kube-api-burst="100"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937449    4806 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937454    4806 flags.go:64] FLAG: --kube-api-qps="50"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937458    4806 flags.go:64] FLAG: --kube-reserved=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937462    4806 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937467    4806 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937471    4806 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937476    4806 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937480    4806 flags.go:64] FLAG: --lock-file=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937484    4806 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937488    4806 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937492    4806 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937501    4806 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937506    4806 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937511    4806 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937516    4806 flags.go:64] FLAG: --logging-format="text"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937521    4806 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937547    4806 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937553    4806 flags.go:64] FLAG: --manifest-url=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937557    4806 flags.go:64] FLAG: --manifest-url-header=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937564    4806 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937568    4806 flags.go:64] FLAG: --max-open-files="1000000"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937575    4806 flags.go:64] FLAG: --max-pods="110"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937579    4806 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937584    4806 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937596    4806 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937601    4806 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937605    4806 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937610    4806 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937614    4806 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937626    4806 flags.go:64] FLAG: --node-status-max-images="50"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937630    4806 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937634    4806 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937639    4806 flags.go:64] FLAG: --pod-cidr=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937643    4806 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937651    4806 flags.go:64] FLAG: --pod-manifest-path=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937655    4806 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937660    4806 flags.go:64] FLAG: --pods-per-core="0"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937664    4806 flags.go:64] FLAG: --port="10250"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937669    4806 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937673    4806 flags.go:64] FLAG: --provider-id=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937678    4806 flags.go:64] FLAG: --qos-reserved=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937683    4806 flags.go:64] FLAG: --read-only-port="10255"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937687    4806 flags.go:64] FLAG: --register-node="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937692    4806 flags.go:64] FLAG: --register-schedulable="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937697    4806 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937708    4806 flags.go:64] FLAG: --registry-burst="10"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937714    4806 flags.go:64] FLAG: --registry-qps="5"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937721    4806 flags.go:64] FLAG: --reserved-cpus=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937726    4806 flags.go:64] FLAG: --reserved-memory=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937735    4806 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937741    4806 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937746    4806 flags.go:64] FLAG: --rotate-certificates="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937752    4806 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937757    4806 flags.go:64] FLAG: --runonce="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937763    4806 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937768    4806 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937774    4806 flags.go:64] FLAG: --seccomp-default="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937779    4806 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937784    4806 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.937790    4806 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938049    4806 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938058    4806 flags.go:64] FLAG: --storage-driver-password="root"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938064    4806 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938069    4806 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938074    4806 flags.go:64] FLAG: --storage-driver-user="root"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938080    4806 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938085    4806 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938090    4806 flags.go:64] FLAG: --system-cgroups=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938095    4806 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938108    4806 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938112    4806 flags.go:64] FLAG: --tls-cert-file=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938116    4806 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938132    4806 flags.go:64] FLAG: --tls-min-version=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938137    4806 flags.go:64] FLAG: --tls-private-key-file=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938142    4806 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938147    4806 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938152    4806 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938157    4806 flags.go:64] FLAG: --v="2"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938171    4806 flags.go:64] FLAG: --version="false"
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938179    4806 flags.go:64] FLAG: --vmodule=""
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217
15:20:38.938185 4806 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.938190 4806 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938330 4806 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938336 4806 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938340 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938344 4806 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938350 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938354 4806 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938359 4806 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938363 4806 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938366 4806 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938370 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938373 4806 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938377 4806 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938381 4806 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 
15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938385 4806 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938394 4806 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938398 4806 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938417 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938420 4806 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938424 4806 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938427 4806 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938431 4806 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938434 4806 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938438 4806 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938441 4806 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938445 4806 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938448 4806 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938452 4806 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938455 4806 feature_gate.go:330] unrecognized 
feature gate: ClusterAPIInstallIBMCloud Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938460 4806 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938464 4806 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938469 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938473 4806 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938478 4806 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938482 4806 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938485 4806 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938489 4806 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938494 4806 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938498 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938503 4806 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938508 4806 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938515 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938520 4806 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938525 4806 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938530 4806 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938536 4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938541 4806 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938545 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938549 4806 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938553 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938557 4806 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938569 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938574 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938578 4806 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938582 4806 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938585 4806 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938588 4806 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 15:20:38 crc 
kubenswrapper[4806]: W0217 15:20:38.938593 4806 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938597 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938602 4806 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938605 4806 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938609 4806 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938614 4806 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938618 4806 feature_gate.go:330] unrecognized feature gate: Example Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938622 4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938626 4806 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938629 4806 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938633 4806 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938636 4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938640 4806 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938644 4806 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 
17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.938648 4806 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.940305 4806 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.953381 4806 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.953473 4806 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953733 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953772 4806 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953783 4806 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953794 4806 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953803 4806 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953812 4806 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953821 4806 feature_gate.go:330] unrecognized feature gate: Example Feb 17 
15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953829 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953836 4806 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953845 4806 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953854 4806 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953862 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953871 4806 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953878 4806 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953889 4806 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953900 4806 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953909 4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953917 4806 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953928 4806 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953938 4806 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953948 4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953957 4806 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953967 4806 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953976 4806 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953984 4806 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.953993 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954001 4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954009 4806 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954018 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954027 4806 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954036 4806 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954046 4806 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954058 4806 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954070 4806 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954079 4806 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954088 4806 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954096 4806 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954104 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954113 4806 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954121 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954130 4806 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954140 4806 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954150 4806 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954159 4806 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954168 4806 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954176 4806 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954185 4806 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954194 4806 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954203 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954211 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954220 4806 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954228 4806 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954236 4806 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954245 4806 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954252 4806 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954260 4806 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954268 4806 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954276 4806 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954283 4806 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954292 4806 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954299 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954307 4806 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954315 4806 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954323 4806 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954332 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954339 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954346 4806 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954354 4806 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954362 4806 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954370 4806 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954378 4806 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.954392 4806 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954665 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954681 4806 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954691 4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954699 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954708 4806 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954715 4806 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954723 4806 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954731 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954739 4806 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954747 4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 15:20:38 crc 
kubenswrapper[4806]: W0217 15:20:38.954755 4806 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954765 4806 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954774 4806 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954783 4806 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954791 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954799 4806 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954806 4806 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954814 4806 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954822 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954830 4806 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954837 4806 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954846 4806 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954855 4806 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954865 4806 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954876 4806 feature_gate.go:353] Setting GA feature 
gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954887 4806 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954896 4806 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954907 4806 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954916 4806 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954925 4806 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954934 4806 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954942 4806 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954950 4806 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954959 4806 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954967 4806 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954975 4806 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954984 4806 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.954996 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955004 4806 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955012 4806 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955019 4806 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955027 4806 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955034 4806 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955042 4806 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955050 4806 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955057 4806 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955065 4806 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955073 4806 feature_gate.go:330] unrecognized feature gate: Example Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955080 4806 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955088 4806 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955096 4806 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955105 4806 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955114 4806 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 
17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955123 4806 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955131 4806 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955139 4806 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955147 4806 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955155 4806 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955163 4806 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955171 4806 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955178 4806 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955186 4806 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955193 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955201 4806 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955209 4806 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955216 4806 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955224 4806 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955234 4806 feature_gate.go:353] 
Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955244 4806 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955253 4806 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 15:20:38 crc kubenswrapper[4806]: W0217 15:20:38.955262 4806 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.955275 4806 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.956600 4806 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.963564 4806 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.963726 4806 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.965926 4806 server.go:997] "Starting client certificate rotation" Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.965979 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.966244 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-27 22:26:34.262950955 +0000 UTC Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.966463 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 15:20:38 crc kubenswrapper[4806]: I0217 15:20:38.997131 4806 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.001792 4806 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.002533 4806 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.021112 4806 log.go:25] "Validated CRI v1 runtime API" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.054518 4806 log.go:25] "Validated CRI v1 image API" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.056299 4806 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.061035 4806 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-15-17-02-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.061067 4806 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.081171 4806 manager.go:217] Machine: {Timestamp:2026-02-17 15:20:39.079243925 +0000 UTC m=+0.609874366 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:aa772b6b-8722-482a-a8e2-1dcbd24be6c8 BootID:0e8a92da-ef57-4d82-8286-19572da4098f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 
Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:29:1d:5a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:29:1d:5a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e5:53:16 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9c:4a:9e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6d:af:4f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5f:75:a4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:5a:40:5b:53:b1:67 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:b0:dc:17:16:97 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.081472 4806 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.081686 4806 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.082369 4806 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.082886 4806 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.082956 4806 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"10
0Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.083314 4806 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.083336 4806 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.083985 4806 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.084040 4806 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.085192 4806 state_mem.go:36] "Initialized new in-memory state store" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.085326 4806 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.088845 4806 kubelet.go:418] "Attempting to sync node with API server" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.088909 4806 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.088964 4806 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.088992 4806 kubelet.go:324] "Adding apiserver pod source" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.089018 4806 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.093932 4806 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.094999 4806 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.094988 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.094991 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.095117 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.095120 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.097567 4806 kubelet.go:854] "Not starting ClusterTrustBundle informer because we 
are in static kubelet mode" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099011 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099061 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099076 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099092 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099113 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099126 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099138 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099159 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099175 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099189 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099207 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.099221 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.100071 4806 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.100711 4806 server.go:1280] "Started kubelet" Feb 17 
15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.100888 4806 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.100966 4806 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.101790 4806 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.102100 4806 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.102983 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.103025 4806 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.103070 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 13:48:48.600257606 +0000 UTC Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.103293 4806 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.103378 4806 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 17 15:20:39 crc systemd[1]: Started Kubernetes Kubelet. 
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.103527 4806 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.103631 4806 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.104035 4806 server.go:460] "Adding debug handlers to kubelet server" Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.104261 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.104330 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.105877 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="200ms" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.106611 4806 factory.go:55] Registering systemd factory Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.107692 4806 factory.go:221] Registration of the systemd container factory successfully Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.111149 4806 factory.go:153] Registering CRI-O factory Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.111212 4806 factory.go:221] Registration of the crio container factory successfully Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 
15:20:39.111371 4806 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.111444 4806 factory.go:103] Registering Raw factory Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.111483 4806 manager.go:1196] Started watching for new ooms in manager Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.111215 4806 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189511d5f6742d9d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 15:20:39.100673437 +0000 UTC m=+0.631303888,LastTimestamp:2026-02-17 15:20:39.100673437 +0000 UTC m=+0.631303888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.112778 4806 manager.go:319] Starting recovery of all containers Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121467 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121553 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121577 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121602 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121624 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121643 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121694 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121716 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121742 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121763 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121783 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121804 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121825 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121850 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121870 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121894 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121920 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121947 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121970 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.121992 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122014 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122035 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122058 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122080 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122100 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122120 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122145 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122167 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122191 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122212 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122236 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122259 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122279 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122298 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122319 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122340 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122372 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122394 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 
15:20:39.122447 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122466 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122488 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122508 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122531 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122553 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122574 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122596 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122617 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122641 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122665 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122691 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122713 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122734 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122762 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122786 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122813 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122833 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122855 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" 
seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122876 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122895 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122916 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122938 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122960 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.122980 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 
15:20:39.123000 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123022 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123040 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123061 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123082 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123102 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123122 4806 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123143 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123163 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123183 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123202 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123224 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123245 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.123266 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125177 4806 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125208 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125222 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125235 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125247 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125257 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125268 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125278 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125288 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125297 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125310 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125328 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125339 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125352 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125368 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125379 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125395 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" 
seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125422 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125439 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125448 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125459 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125470 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125480 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 
15:20:39.125491 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125501 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125511 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125521 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125530 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125546 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125557 4806 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125568 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125580 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125593 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125604 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125616 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125628 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125639 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125650 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125679 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125692 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125703 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125714 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125726 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125737 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125754 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125765 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125774 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125785 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" 
seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125795 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125805 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125817 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125826 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125839 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125850 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125861 4806 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125871 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125881 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125890 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125899 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125908 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125919 4806 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125929 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125940 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125950 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125965 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125979 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125989 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.125999 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126008 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126018 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126028 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126038 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126047 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126058 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126068 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126078 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126109 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126119 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126128 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126137 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126148 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126157 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126168 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126178 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126187 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126197 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126208 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126217 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126226 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126236 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126246 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126257 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126267 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126276 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126286 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126296 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126306 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126318 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126328 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126337 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126348 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126358 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126368 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 
15:20:39.126379 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126390 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126400 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126434 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126443 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126454 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126464 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126475 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126485 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126495 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126504 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126515 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126524 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126534 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126544 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126553 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126562 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126572 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126581 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126591 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126600 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126609 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126620 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126632 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126643 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126659 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126674 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126691 4806 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126701 4806 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.126708 4806 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.151491 4806 manager.go:324] Recovery completed
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.157477 4806 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.159570 4806 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.159613 4806 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.159731 4806 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.159803 4806 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.161816 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.161886 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.164187 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.165663 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.165719 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.165731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.166656 4806 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.166692 4806 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.166723 4806 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.182158 4806 policy_none.go:49] "None policy: Start"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.183586 4806 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.183625 4806 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.203875 4806 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.235823 4806 manager.go:334] "Starting Device Plugin manager"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.235879 4806 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.235892 4806 server.go:79] "Starting device plugin registration server"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.236396 4806 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.236434 4806 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.236632 4806 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.236987 4806 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.237009 4806 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.244704 4806 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.260148 4806 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.260246 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.264665 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.264710 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.264726 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.264917 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.265729 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.265914 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268164 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268198 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268209 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268167 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268250 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268289 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268532 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.268864 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.269146 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270416 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270444 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270455 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270565 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270766 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.270795 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271017 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271072 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271356 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271385 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271472 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271505 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271514 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271619 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc 
kubenswrapper[4806]: I0217 15:20:39.271682 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.271716 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272272 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272290 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272299 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272343 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272377 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272455 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.272494 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.273064 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.273087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.273096 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.306795 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="400ms" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.327970 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328014 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328041 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328064 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328086 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328106 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328185 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328255 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328320 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328367 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328430 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328465 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328488 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328509 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.328529 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.337171 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.338668 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.338723 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.338747 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.338788 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.339341 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc" Feb 17 15:20:39 
crc kubenswrapper[4806]: I0217 15:20:39.429654 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429766 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429808 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429842 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429874 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429905 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429936 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429969 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.429986 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430091 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430005 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 
crc kubenswrapper[4806]: I0217 15:20:39.430076 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430171 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430190 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430204 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430216 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430204 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430329 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430242 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430240 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430243 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430456 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc 
kubenswrapper[4806]: I0217 15:20:39.430496 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430602 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430531 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430690 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430785 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430844 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.430923 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.431038 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.539698 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.541332 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.541374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.541390 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.541432 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.541935 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection 
refused" node="crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.604873 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.619148 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.627665 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.647551 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.652735 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.659386 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-255108b50a058b34fcf53bed0478d11ee770b2b08c393d8f1aff75159754cf89 WatchSource:0}: Error finding container 255108b50a058b34fcf53bed0478d11ee770b2b08c393d8f1aff75159754cf89: Status 404 returned error can't find the container with id 255108b50a058b34fcf53bed0478d11ee770b2b08c393d8f1aff75159754cf89 Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.661544 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-522c5603f9379a45a8f9999a00f7041980cf5ca0266962ba8e3adeb0079e818f WatchSource:0}: Error finding container 522c5603f9379a45a8f9999a00f7041980cf5ca0266962ba8e3adeb0079e818f: Status 404 returned error can't find the container with id 
522c5603f9379a45a8f9999a00f7041980cf5ca0266962ba8e3adeb0079e818f Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.670922 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ff12e8e0cbabbcbcb93ae149d6e180f3374d201e677222c1ce019b998c8ade3c WatchSource:0}: Error finding container ff12e8e0cbabbcbcb93ae149d6e180f3374d201e677222c1ce019b998c8ade3c: Status 404 returned error can't find the container with id ff12e8e0cbabbcbcb93ae149d6e180f3374d201e677222c1ce019b998c8ade3c Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.675656 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-39ac8cf40e9d4ecfff9dcf0a3251fd993fa2478ec60ab7ec63b05774dbf2b618 WatchSource:0}: Error finding container 39ac8cf40e9d4ecfff9dcf0a3251fd993fa2478ec60ab7ec63b05774dbf2b618: Status 404 returned error can't find the container with id 39ac8cf40e9d4ecfff9dcf0a3251fd993fa2478ec60ab7ec63b05774dbf2b618 Feb 17 15:20:39 crc kubenswrapper[4806]: W0217 15:20:39.678139 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7fee3481f0b9bf97a0dcde0e4e8712cc82f80f2b7fdedcfc2933cf8337d7527b WatchSource:0}: Error finding container 7fee3481f0b9bf97a0dcde0e4e8712cc82f80f2b7fdedcfc2933cf8337d7527b: Status 404 returned error can't find the container with id 7fee3481f0b9bf97a0dcde0e4e8712cc82f80f2b7fdedcfc2933cf8337d7527b Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.707890 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="800ms" 
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.942855 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.945155 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.945259 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.945290 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:39 crc kubenswrapper[4806]: I0217 15:20:39.945351 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 15:20:39 crc kubenswrapper[4806]: E0217 15:20:39.946313 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.103157 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 13:09:29.173129695 +0000 UTC
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.103446 4806 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.165166 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ff12e8e0cbabbcbcb93ae149d6e180f3374d201e677222c1ce019b998c8ade3c"}
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.166545 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"522c5603f9379a45a8f9999a00f7041980cf5ca0266962ba8e3adeb0079e818f"}
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.167726 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"255108b50a058b34fcf53bed0478d11ee770b2b08c393d8f1aff75159754cf89"}
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.169194 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fee3481f0b9bf97a0dcde0e4e8712cc82f80f2b7fdedcfc2933cf8337d7527b"}
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.170531 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"39ac8cf40e9d4ecfff9dcf0a3251fd993fa2478ec60ab7ec63b05774dbf2b618"}
Feb 17 15:20:40 crc kubenswrapper[4806]: W0217 15:20:40.213949 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.214025 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Feb 17 15:20:40 crc kubenswrapper[4806]: W0217 15:20:40.268263 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.268345 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.508891 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="1.6s"
Feb 17 15:20:40 crc kubenswrapper[4806]: W0217 15:20:40.597346 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.597530 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Feb 17 15:20:40 crc kubenswrapper[4806]: W0217 15:20:40.661323 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.661465 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.747192 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.749295 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.749349 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.749362 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:20:40 crc kubenswrapper[4806]: I0217 15:20:40.749396 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 15:20:40 crc kubenswrapper[4806]: E0217 15:20:40.750061 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc"
Feb 17 15:20:41 crc kubenswrapper[4806]: E0217 15:20:41.103167 4806 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189511d5f6742d9d default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 15:20:39.100673437 +0000 UTC m=+0.631303888,LastTimestamp:2026-02-17 15:20:39.100673437 +0000 UTC m=+0.631303888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.103339 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 13:11:24.413563756 +0000 UTC Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.103434 4806 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.117873 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 15:20:41 crc kubenswrapper[4806]: E0217 15:20:41.118836 4806 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.175669 4806 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5" exitCode=0 Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.175780 
4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.175825 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.176964 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.176999 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.177013 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.179056 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.179092 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.179103 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 
15:20:41.179114 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.179259 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.180879 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90" exitCode=0 Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.180961 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.180991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.181037 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.181054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.181122 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.182743 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.182791 4806 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.182813 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.183990 4806 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff" exitCode=0 Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.184048 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.184149 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.185245 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.185288 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.185307 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.188532 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.191935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.191982 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: 
I0217 15:20:41.191998 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.192106 4806 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="7c0033448df4fca90c10dfb3081a8123d27f6e6bdbc04fd0b1fa38afa99ad61c" exitCode=0 Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.192166 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"7c0033448df4fca90c10dfb3081a8123d27f6e6bdbc04fd0b1fa38afa99ad61c"} Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.192305 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.193535 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.193586 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.193605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:41 crc kubenswrapper[4806]: I0217 15:20:41.832916 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.103444 4806 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.104443 4806 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 13:08:39.776683363 +0000 UTC Feb 17 15:20:42 crc kubenswrapper[4806]: E0217 15:20:42.110186 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="3.2s" Feb 17 15:20:42 crc kubenswrapper[4806]: W0217 15:20:42.161781 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:42 crc kubenswrapper[4806]: E0217 15:20:42.161846 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.197234 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.197287 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.197302 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.197315 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.198857 4806 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2" exitCode=0 Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.198914 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.199009 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.199953 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.199986 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.200006 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.202746 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fcea55ff5624fac9013d44147b26633abcc7e4a3d76b66d290aa7da2df4346ed"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.203054 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.205991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.206024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.206035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.207284 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.207608 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.207629 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.207990 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.208180 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941"} Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.212851 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.212886 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.212899 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.213604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.213627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.213637 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.351071 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.352094 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.352123 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.352132 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:42 crc kubenswrapper[4806]: I0217 15:20:42.352154 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc" 
Feb 17 15:20:42 crc kubenswrapper[4806]: E0217 15:20:42.352532 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.12:6443: connect: connection refused" node="crc" Feb 17 15:20:42 crc kubenswrapper[4806]: W0217 15:20:42.597261 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.12:6443: connect: connection refused Feb 17 15:20:42 crc kubenswrapper[4806]: E0217 15:20:42.597372 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.12:6443: connect: connection refused" logger="UnhandledError" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.104593 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:44:10.993510296 +0000 UTC Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.215265 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53"} Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.215464 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.216888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.216917 4806 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.216929 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.218902 4806 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f" exitCode=0 Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.218956 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f"} Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.219066 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.219090 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.219109 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.219232 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.219093 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220741 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220788 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220797 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220922 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.220947 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221113 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221170 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221195 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221555 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:43 crc kubenswrapper[4806]: I0217 15:20:43.221596 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.104710 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 06:32:23.908824589 +0000 UTC 
Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.229806 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.229884 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b"} Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.229963 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.229982 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4"} Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.229998 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d"} Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.230012 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55"} Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.230010 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231367 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231426 4806 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231367 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231444 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.231486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:44 crc kubenswrapper[4806]: I0217 15:20:44.526896 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.105579 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:21:39.96550852 +0000 UTC Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.190194 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.190474 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.192481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.192546 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.192565 4806 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.238468 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359"} Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.238612 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.238609 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.239844 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.239921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.239932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.240160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.240213 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.240233 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.300606 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.543124 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.553369 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.555849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.555908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.555925 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:45 crc kubenswrapper[4806]: I0217 15:20:45.555963 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.106500 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 05:56:07.092887635 +0000 UTC Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.240508 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.240549 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.241435 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.241502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.241513 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:46 crc kubenswrapper[4806]: 
I0217 15:20:46.241510 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.241546 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:46 crc kubenswrapper[4806]: I0217 15:20:46.241561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.107344 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 23:04:21.955910313 +0000 UTC Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.243147 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.244314 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.244374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.244391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.447987 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.448290 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.449930 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.449991 4806 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:47 crc kubenswrapper[4806]: I0217 15:20:47.450014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.107725 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 11:38:45.542987198 +0000 UTC Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.278210 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.278382 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.280896 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.280995 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.281026 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:48 crc kubenswrapper[4806]: I0217 15:20:48.290994 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.108701 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:33:07.674159895 +0000 UTC Feb 17 15:20:49 crc kubenswrapper[4806]: E0217 15:20:49.244838 4806 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.248147 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.249395 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.249464 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.249484 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.346574 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.346918 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.348649 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.348715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:49 crc kubenswrapper[4806]: I0217 15:20:49.348739 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.109498 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:59:30.36798761 +0000 UTC Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.277347 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.277652 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.279357 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.279433 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.279445 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:50 crc kubenswrapper[4806]: I0217 15:20:50.284053 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:20:51 crc kubenswrapper[4806]: I0217 15:20:51.109868 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:59:55.342025052 +0000 UTC Feb 17 15:20:51 crc kubenswrapper[4806]: I0217 15:20:51.253223 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:51 crc kubenswrapper[4806]: I0217 15:20:51.254357 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:51 crc kubenswrapper[4806]: I0217 15:20:51.254445 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:51 crc kubenswrapper[4806]: I0217 15:20:51.254462 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:52 crc kubenswrapper[4806]: I0217 15:20:52.110090 4806 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 11:44:30.324566914 +0000 UTC Feb 17 15:20:52 crc kubenswrapper[4806]: I0217 15:20:52.898235 4806 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52456->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 15:20:52 crc kubenswrapper[4806]: I0217 15:20:52.898311 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52456->192.168.126.11:17697: read: connection reset by peer" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.059932 4806 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.059998 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.103820 4806 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 
15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.111035 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:51:16.129947287 +0000 UTC Feb 17 15:20:53 crc kubenswrapper[4806]: W0217 15:20:53.183731 4806 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.183870 4806 trace.go:236] Trace[465123099]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 15:20:43.181) (total time: 10001ms): Feb 17 15:20:53 crc kubenswrapper[4806]: Trace[465123099]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:20:53.183) Feb 17 15:20:53 crc kubenswrapper[4806]: Trace[465123099]: [10.001913759s] [10.001913759s] END Feb 17 15:20:53 crc kubenswrapper[4806]: E0217 15:20:53.183904 4806 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.260444 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.262958 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53" exitCode=255 Feb 17 15:20:53 crc 
kubenswrapper[4806]: I0217 15:20:53.263012 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53"} Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.263209 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.265482 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.265589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.265618 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.267643 4806 scope.go:117] "RemoveContainer" containerID="31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.277475 4806 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.277563 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.513509 4806 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.513583 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.523505 4806 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Feb 17 15:20:53 crc kubenswrapper[4806]: I0217 15:20:53.523837 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.111917 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 12:34:30.282613465 +0000 UTC 
Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.270878 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.275001 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b"} Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.275163 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.276119 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.276194 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.276230 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.537714 4806 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]log ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]etcd ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 17 15:20:54 crc kubenswrapper[4806]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/priority-and-fairness-filter ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-apiextensions-informers ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-apiextensions-controllers ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/crd-informer-synced ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-system-namespaces-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 17 15:20:54 crc kubenswrapper[4806]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/bootstrap-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/start-kube-aggregator-informers ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-registration-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-discovery-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]autoregister-completion ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-openapi-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 17 15:20:54 crc kubenswrapper[4806]: livez check failed Feb 17 15:20:54 crc kubenswrapper[4806]: I0217 15:20:54.537771 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:20:55 crc kubenswrapper[4806]: I0217 15:20:55.112535 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 18:00:02.984057701 +0000 UTC Feb 17 15:20:56 crc kubenswrapper[4806]: I0217 15:20:56.113891 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:27:48.236124856 +0000 UTC Feb 17 15:20:57 crc kubenswrapper[4806]: I0217 15:20:57.114323 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 02:21:15.909205942 +0000 UTC Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.115199 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 01:55:21.710994248 +0000 UTC Feb 17 15:20:58 crc kubenswrapper[4806]: E0217 15:20:58.514188 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.516764 4806 trace.go:236] Trace[1436010762]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 15:20:46.729) (total time: 11787ms): Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[1436010762]: ---"Objects listed" error: 11787ms (15:20:58.516) Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[1436010762]: [11.787217396s] [11.787217396s] END Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.516805 4806 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.517985 4806 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.526741 4806 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.526768 4806 trace.go:236] Trace[2040499481]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 15:20:43.602) (total time: 14924ms): Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[2040499481]: ---"Objects listed" error: 14924ms (15:20:58.526) Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[2040499481]: [14.924331181s] 
[14.924331181s] END Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.526796 4806 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 15:20:58 crc kubenswrapper[4806]: E0217 15:20:58.527244 4806 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.527347 4806 trace.go:236] Trace[1099970659]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 15:20:46.735) (total time: 11791ms): Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[1099970659]: ---"Objects listed" error: 11791ms (15:20:58.527) Feb 17 15:20:58 crc kubenswrapper[4806]: Trace[1099970659]: [11.791648836s] [11.791648836s] END Feb 17 15:20:58 crc kubenswrapper[4806]: I0217 15:20:58.527681 4806 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.102748 4806 apiserver.go:52] "Watching apiserver" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.109229 4806 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.109650 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110195 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110206 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110655 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.110709 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110769 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.110797 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110851 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.110872 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.110323 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.115443 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 23:40:01.111515113 +0000 UTC Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.115865 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.117884 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.118169 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.118439 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.118923 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.122858 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 
15:20:59.123153 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.123312 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.123321 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.157849 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.179706 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.194233 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.195097 4806 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.211976 4806 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.212273 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.230953 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231006 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231033 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231070 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231092 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231112 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231137 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231158 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231181 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231203 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231228 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231256 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.231276 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231296 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231316 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231335 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231360 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231382 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231423 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231452 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231476 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231498 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231520 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231542 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231564 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231583 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231633 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231654 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231676 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.231697 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231719 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231743 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231766 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231790 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231813 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231835 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231858 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231880 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231900 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231921 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 
15:20:59.231942 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231965 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.231989 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232025 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232063 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232086 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232107 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232129 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232152 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232174 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232195 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.232217 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232238 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232260 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232283 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232306 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232327 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232349 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232376 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232413 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232438 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232461 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.232484 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232507 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232534 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232557 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232579 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232600 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232625 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232648 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232670 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232695 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232719 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" 
(UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232742 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232815 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232839 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232860 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232882 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232905 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232925 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232948 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232970 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.232992 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233015 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.233040 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233065 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233089 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233112 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233133 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233167 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233191 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233213 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233234 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233257 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233279 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233300 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233324 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233346 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233369 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233391 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233428 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233451 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233475 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233497 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233542 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233565 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 
15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233591 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233612 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233634 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233657 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233679 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233701 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233726 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233747 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233770 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233792 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233816 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233838 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233862 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233887 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233911 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233943 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233965 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.233988 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234013 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234037 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234059 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234082 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234106 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234129 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234151 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234174 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234197 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234220 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 
15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234245 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234269 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234376 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.234654 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235083 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235116 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235140 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235165 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235188 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235212 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235237 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235263 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235289 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235313 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235337 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235361 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235387 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235433 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235459 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235482 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235507 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235530 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235555 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235579 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235602 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235626 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235650 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235675 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235699 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235721 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235746 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235771 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235795 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235820 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235847 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235872 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235897 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235926 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235951 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235974 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.235998 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236023 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236049 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236075 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236098 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236121 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236147 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236172 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236197 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236220 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236243 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236267 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236291 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236318 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236341 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 15:20:59 crc 
kubenswrapper[4806]: I0217 15:20:59.236369 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236393 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236502 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236528 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236555 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236579 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236603 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236651 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236679 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236704 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236730 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236758 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236800 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236830 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236858 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236885 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236913 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236938 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236962 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.236987 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.237013 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.237209 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.237480 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.237708 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.238779 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.239106 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.239482 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.239833 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.240040 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.240191 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.240534 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.240557 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.240679 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.241720 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.241951 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244127 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244223 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244495 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244736 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244997 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.245144 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.245906 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.246080 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.246259 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.250647 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251249 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251272 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251538 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251592 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251827 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251902 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.251932 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.244205 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.252471 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.252704 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.252925 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.253366 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.253674 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.253735 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.252279 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.254519 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.254876 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.255205 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.255547 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256029 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256447 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256461 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256665 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.257114 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.257241 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.257575 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.257698 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.258395 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.258475 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.258886 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.258952 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.259180 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.259439 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.259674 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.260496 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.260952 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.260969 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.261504 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.261771 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.261890 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.262244 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.262313 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.262528 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256867 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.263046 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.263041 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.263756 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.263784 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.264090 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.264297 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.264504 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.264370 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.265337 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.265366 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.265635 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:20:59.765590834 +0000 UTC m=+21.296221265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.265711 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.266703 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.266789 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.266787 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.267676 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.267973 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.268196 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.269988 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.269959 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.270487 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.270921 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.256606 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.270979 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.271382 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.272144 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.272959 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.273455 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.273665 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.273712 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.274140 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.274374 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.274472 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.274620 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.274949 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.275339 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.275708 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.275793 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.275848 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.276343 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.276341 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.276438 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.278371 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.278523 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.278522 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.279774 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.280395 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.280498 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.280651 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.280694 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281147 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281345 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281576 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281703 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281823 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.281903 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.282458 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.283197 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.283626 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.283939 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.284335 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.284537 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.284612 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.284821 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.285030 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.285215 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286079 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286485 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286589 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286640 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286727 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.286910 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.287091 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.287288 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.287392 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.288316 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.289917 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.290272 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:20:59.790207845 +0000 UTC m=+21.320838266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.291037 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.291084 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.291834 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.291974 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.292819 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.292833 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.292856 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.292988 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.293075 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.293079 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.293120 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:20:59.792921553 +0000 UTC m=+21.323551974 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294072 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294088 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294154 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294181 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294980 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.295000 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.295221 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.295474 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.295909 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.296018 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.296201 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.297627 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.297693 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.298800 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.298970 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.299003 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.299114 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.299602 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.300008 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.300297 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.300875 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301011 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301084 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301365 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301608 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301682 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301853 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.301941 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.302105 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.294528 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.302383 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.302921 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.303834 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.304253 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.306575 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.309730 4806 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.309834 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.310377 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.310706 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.311139 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.332965 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.337105 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.337850 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.337935 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.338000 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.338112 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:20:59.838093246 +0000 UTC m=+21.368723657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.338635 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.338823 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.338984 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.339031 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.339823 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340203 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340281 4806 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340337 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340394 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340488 4806 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340555 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340609 4806 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340665 4806 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340716 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340766 4806 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340823 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340882 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" 
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340937 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.340998 4806 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.341055 4806 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.341112 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345446 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345541 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345609 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345662 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345717 4806 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345771 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345826 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345881 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345934 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345988 4806 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346042 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346097 4806 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346148 4806 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346197 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346250 4806 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346300 4806 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346353 4806 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346428 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346490 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346541 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346590 4806 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346645 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346699 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346755 4806 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346811 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346865 4806 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346918 4806 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.346972 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347027 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347083 4806 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347138 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347191 4806 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347247 4806 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347301 4806 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347358 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347421 4806 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347482 4806 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347542 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347593 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347647 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347701 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347752 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347800 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347857 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347911 4806 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.347970 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348021 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348073 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348138 4806 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348189 4806 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348241 4806 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348295 4806 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348349 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348415 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348478 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348530 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348587 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348642 4806 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348697 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348751 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348808 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348857 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348911 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.348966 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349017 4806 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349071 4806 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349130 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349184 4806 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349241 4806 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349292 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349345 4806 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349415 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349477 4806 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349536 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349587 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349649 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349701 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349750 4806 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349799 4806 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349852 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349904 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.349975 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350033 4806 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350087 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350143 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350196 4806 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350250 4806 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350322 4806 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.350387 4806 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.358672 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.358816 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359168 4806 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359244 4806 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359308 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359369 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359451 4806 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359521 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359600 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359675 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359736 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359802 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359863 4806 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.359944 4806 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360014 4806 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360075 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360143 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360214 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360273 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360337 4806 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360438 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360515 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360585 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360668 4806 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360894 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.360975 4806 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361043 4806 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361111 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361187 4806 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361260 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361330 4806 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361412 4806 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361487 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361561 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361626 4806 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361694 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361764 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361823 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361885 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.361948 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362010 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362078 4806 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362145 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362203 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362265 4806 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362332 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362397 4806 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362484 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362560 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362626 4806 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362696 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362753 4806 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362835 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.356947 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.360868 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.345337 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.362966 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.362983 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362972 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.358759 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.363050 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:20:59.863024535 +0000 UTC m=+21.393654946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.362897 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363091 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363103 4806 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363112 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363123 4806 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363112 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363136 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363251 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363269 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363288 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363308 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363322 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363333 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363343 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363354 4806 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363365 4806 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363375 4806 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363385 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363395 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363436 4806 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363455 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363470 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363487 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363501 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363514 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363530 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363543 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363555 4806 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363570 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363584 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363598 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.363613 4806 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.365828 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.371161 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.375532 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.378778 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.380853 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.393760 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.400988 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.408415 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.415100 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.419108 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.428765 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.438991 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.444276 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: W0217 15:20:59.453734 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-ffa22cfcb2915538240d121d59f824ec48b868ce80794c394bc4af4b8bf77c22 WatchSource:0}: Error finding container ffa22cfcb2915538240d121d59f824ec48b868ce80794c394bc4af4b8bf77c22: Status 404 returned error can't find the container with id ffa22cfcb2915538240d121d59f824ec48b868ce80794c394bc4af4b8bf77c22 Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.454284 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.462524 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.464644 4806 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.464666 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.464679 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.464692 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.464701 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.474073 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.476026 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.485007 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: W0217 15:20:59.500541 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-fa3ce9a65e705c33d6b06ed5b17b9c561a6d0788ae36e9422a2dd4e1224ba51e WatchSource:0}: Error finding container fa3ce9a65e705c33d6b06ed5b17b9c561a6d0788ae36e9422a2dd4e1224ba51e: Status 404 returned error can't find the container with id fa3ce9a65e705c33d6b06ed5b17b9c561a6d0788ae36e9422a2dd4e1224ba51e Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.512283 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.523346 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.532390 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.533073 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.534444 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.541748 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.546344 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.549527 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.558697 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.570961 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.581335 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.590984 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, /tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.599442 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.609617 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.620994 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.638022 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.649111 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.659424 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.685366 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.711589 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, /tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.733598 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.755669 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.766927 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.767088 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:21:00.767056466 +0000 UTC m=+22.297686907 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.769468 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.787611 4806 csr.go:261] certificate signing request csr-tq269 is approved, waiting to be issued Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.794572 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.801119 4806 csr.go:257] certificate signing request csr-tq269 is issued Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.810952 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.827476 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.836484 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tjnkx"] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.836987 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.840427 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.842690 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.842974 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.843072 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.843166 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.853912 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.867730 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.867780 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.867812 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.867837 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.867989 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.867992 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868034 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:20:59 crc 
kubenswrapper[4806]: E0217 15:20:59.868029 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868049 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868051 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868069 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:00.868051056 +0000 UTC m=+22.398681467 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868076 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868241 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:00.86821932 +0000 UTC m=+22.398849731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868241 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868263 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-17 15:21:00.868253421 +0000 UTC m=+22.398883832 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: E0217 15:20:59.868342 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:00.868294982 +0000 UTC m=+22.398925463 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.868602 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.886919 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.917130 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.927563 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.946693 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, 
/tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.960216 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.968903 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxf9\" (UniqueName: \"kubernetes.io/projected/aff0cd70-eca5-4222-85b8-dd4543122e01-kube-api-access-lzxf9\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.968933 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aff0cd70-eca5-4222-85b8-dd4543122e01-serviceca\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.968954 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff0cd70-eca5-4222-85b8-dd4543122e01-host\") pod \"node-ca-tjnkx\" (UID: 
\"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.976102 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:20:59 crc kubenswrapper[4806]: I0217 15:20:59.997797 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.070325 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxf9\" (UniqueName: \"kubernetes.io/projected/aff0cd70-eca5-4222-85b8-dd4543122e01-kube-api-access-lzxf9\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.070363 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aff0cd70-eca5-4222-85b8-dd4543122e01-serviceca\") pod 
\"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.070382 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff0cd70-eca5-4222-85b8-dd4543122e01-host\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.070473 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff0cd70-eca5-4222-85b8-dd4543122e01-host\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.071534 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aff0cd70-eca5-4222-85b8-dd4543122e01-serviceca\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.089787 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxf9\" (UniqueName: \"kubernetes.io/projected/aff0cd70-eca5-4222-85b8-dd4543122e01-kube-api-access-lzxf9\") pod \"node-ca-tjnkx\" (UID: \"aff0cd70-eca5-4222-85b8-dd4543122e01\") " pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.115721 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 01:39:15.403800091 +0000 UTC Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.148229 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tjnkx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.278972 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lvlwv"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.279262 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.280644 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jwndx"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.280981 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.281061 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.281166 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.283126 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.283889 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.285000 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.285731 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.286105 4806 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.286278 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.286424 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.287826 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.302511 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.317461 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.317681 4806 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.317996 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.320107 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" exitCode=255 Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.320169 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.320213 4806 scope.go:117] "RemoveContainer" containerID="31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.320616 4806 scope.go:117] "RemoveContainer" containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.320773 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.321632 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 
15:21:00.323102 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tjnkx" event={"ID":"aff0cd70-eca5-4222-85b8-dd4543122e01","Type":"ContainerStarted","Data":"66e9f5fd285bda0268507fadc9a71bb99127d20c387de5c386711e3ea2ed81e4"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.324001 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.324028 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fa3ce9a65e705c33d6b06ed5b17b9c561a6d0788ae36e9422a2dd4e1224ba51e"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.325544 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.325568 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.325578 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"12f395e4a50d729c69e5826d731f2565d234b3e09e4dd7f47738a300b6eda40d"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 
15:21:00.326973 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ffa22cfcb2915538240d121d59f824ec48b868ce80794c394bc4af4b8bf77c22"} Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.338795 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.339427 4806 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374427 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e61ad46f-e059-42a8-a36b-cf791e3bf196-hosts-file\") pod \"node-resolver-lvlwv\" (UID: 
\"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374469 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbp9l\" (UniqueName: \"kubernetes.io/projected/e61ad46f-e059-42a8-a36b-cf791e3bf196-kube-api-access-cbp9l\") pod \"node-resolver-lvlwv\" (UID: \"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374507 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888ccee0-4c6b-45ea-9d8c-00668327ca0d-proxy-tls\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374527 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rw9l\" (UniqueName: \"kubernetes.io/projected/888ccee0-4c6b-45ea-9d8c-00668327ca0d-kube-api-access-4rw9l\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374569 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/888ccee0-4c6b-45ea-9d8c-00668327ca0d-rootfs\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.374586 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/888ccee0-4c6b-45ea-9d8c-00668327ca0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.378766 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4
b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\"
:\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.399146 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.427651 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.446013 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, /tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.459369 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475086 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475198 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e61ad46f-e059-42a8-a36b-cf791e3bf196-hosts-file\") pod \"node-resolver-lvlwv\" (UID: \"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475233 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbp9l\" (UniqueName: \"kubernetes.io/projected/e61ad46f-e059-42a8-a36b-cf791e3bf196-kube-api-access-cbp9l\") pod \"node-resolver-lvlwv\" (UID: \"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475302 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/e61ad46f-e059-42a8-a36b-cf791e3bf196-hosts-file\") pod \"node-resolver-lvlwv\" (UID: \"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475314 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888ccee0-4c6b-45ea-9d8c-00668327ca0d-proxy-tls\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475355 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rw9l\" (UniqueName: \"kubernetes.io/projected/888ccee0-4c6b-45ea-9d8c-00668327ca0d-kube-api-access-4rw9l\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475373 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/888ccee0-4c6b-45ea-9d8c-00668327ca0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475424 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/888ccee0-4c6b-45ea-9d8c-00668327ca0d-rootfs\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.475651 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/888ccee0-4c6b-45ea-9d8c-00668327ca0d-rootfs\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.476785 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/888ccee0-4c6b-45ea-9d8c-00668327ca0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.482131 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888ccee0-4c6b-45ea-9d8c-00668327ca0d-proxy-tls\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.492598 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.495283 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbp9l\" (UniqueName: \"kubernetes.io/projected/e61ad46f-e059-42a8-a36b-cf791e3bf196-kube-api-access-cbp9l\") pod \"node-resolver-lvlwv\" (UID: \"e61ad46f-e059-42a8-a36b-cf791e3bf196\") " pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.496923 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rw9l\" 
(UniqueName: \"kubernetes.io/projected/888ccee0-4c6b-45ea-9d8c-00668327ca0d-kube-api-access-4rw9l\") pod \"machine-config-daemon-jwndx\" (UID: \"888ccee0-4c6b-45ea-9d8c-00668327ca0d\") " pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.513840 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.532849 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, /tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.547905 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.570893 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.586326 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.597870 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.600465 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lvlwv" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.607534 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.613248 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61ad46f_e059_42a8_a36b_cf791e3bf196.slice/crio-0cc247f0267a995d00f12d8ea8335aab7bfc90d1517b8317ef6fea8de3463f7d WatchSource:0}: Error finding container 0cc247f0267a995d00f12d8ea8335aab7bfc90d1517b8317ef6fea8de3463f7d: Status 404 returned error can't find the container with id 0cc247f0267a995d00f12d8ea8335aab7bfc90d1517b8317ef6fea8de3463f7d Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 
15:21:00.621222 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.638789 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888ccee0_4c6b_45ea_9d8c_00668327ca0d.slice/crio-28d5ada3a31363bcec033f5991fa77aa2cd01cbca220d47bffb88bb5d10c8b98 WatchSource:0}: Error finding container 28d5ada3a31363bcec033f5991fa77aa2cd01cbca220d47bffb88bb5d10c8b98: Status 404 returned error can't find the container with id 28d5ada3a31363bcec033f5991fa77aa2cd01cbca220d47bffb88bb5d10c8b98 Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.647567 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.664020 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.680417 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.692327 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-r9b8d"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.692920 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.695183 4806 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.695252 4806 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.695248 4806 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group 
\"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.695297 4806 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.695287 4806 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.695348 4806 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.695182 4806 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.695382 4806 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 15:21:00 crc kubenswrapper[4806]: W0217 15:21:00.695200 4806 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.695431 4806 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.695564 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wgg2s"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.695897 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2m855"] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.696085 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.696491 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.696817 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.698222 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.700220 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.704232 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.726553 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.746367 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.768663 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778064 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778150 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778172 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-hostroot\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.778204 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.778179424 +0000 UTC m=+24.308809835 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778266 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778289 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778311 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778329 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides\") pod \"ovnkube-node-2m855\" 
(UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778347 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-socket-dir-parent\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778374 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778389 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778419 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-cnibin\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778450 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-bin\") pod 
\"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778522 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778562 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-conf-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778614 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778638 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778655 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-netns\") pod \"multus-wgg2s\" (UID: 
\"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778676 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-multus-certs\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778717 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778747 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778771 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-cnibin\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778793 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24cx\" (UniqueName: 
\"kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778817 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778841 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-kubelet\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778867 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-system-cni-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778888 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778908 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778929 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778952 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-os-release\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.778977 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779005 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-os-release\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779049 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779078 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779101 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779122 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-daemon-config\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779141 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5kbf\" (UniqueName: \"kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779188 
4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779206 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779223 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7gh\" (UniqueName: \"kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779239 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-system-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779254 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 
15:21:00.779270 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-multus\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779287 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-k8s-cni-cncf-io\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779305 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-etc-kubernetes\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.779340 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.788866 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.802540 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 15:15:59 +0000 UTC, rotation deadline is 2026-11-12 02:35:35.097451587 +0000 UTC Feb 17 
15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.802606 4806 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6419h14m34.294848296s for next certificate rotation Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.806683 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.824533 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.854023 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6
a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880700 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-multus\") pod \"multus-wgg2s\" (UID: 
\"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880750 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880774 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-k8s-cni-cncf-io\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880794 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880817 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-etc-kubernetes\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880835 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-multus\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 
crc kubenswrapper[4806]: I0217 15:21:00.880842 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880895 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880916 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880936 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-hostroot\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880960 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880986 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881008 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881032 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881051 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881071 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-socket-dir-parent\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: 
I0217 15:21:00.881094 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881114 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881136 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-cnibin\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881201 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-bin\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881227 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881247 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-conf-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881266 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881284 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881306 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881325 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881345 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881363 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-netns\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881380 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-multus-certs\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881417 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881431 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880872 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881448 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881467 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-cnibin\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881483 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881489 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x24cx\" (UniqueName: \"kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881494 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881511 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881528 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881535 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-hostroot\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881540 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-cnibin\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881543 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-kubelet\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " 
pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881565 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-kubelet\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881418 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-socket-dir-parent\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881572 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-system-cni-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.881580 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881592 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-cnibin\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881596 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.881626 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.881611064 +0000 UTC m=+24.412241475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881642 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-os-release\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881665 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.880897 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-k8s-cni-cncf-io\") pod \"multus-wgg2s\" 
(UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881680 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-var-lib-cni-bin\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881686 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-os-release\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881702 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881512 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881708 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-daemon-config\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881731 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881744 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.881751 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881759 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.881767 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881772 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7gh\" (UniqueName: \"kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc 
kubenswrapper[4806]: E0217 15:21:00.881778 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881788 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-system-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881802 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kbf\" (UniqueName: \"kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.881810 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.881799319 +0000 UTC m=+24.412429730 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881827 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881844 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881862 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881867 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-etc-kubernetes\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881901 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-conf-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881880 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881957 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881969 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881997 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-multus-certs\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.881992 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-host-run-netns\") pod 
\"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882012 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882026 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882050 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882060 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882126 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882140 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882147 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-system-cni-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882164 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.882151638 +0000 UTC m=+24.412782079 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882240 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-system-cni-dir\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882255 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-multus-daemon-config\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882247 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882395 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882496 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882520 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a981fc6-90ce-4056-b041-a0089f3b40f8-os-release\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882560 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882575 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882585 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882601 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/344f8a87-e00f-4f0a-a0bc-aee197271160-os-release\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:00 crc kubenswrapper[4806]: E0217 15:21:00.882617 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.882607909 +0000 UTC m=+24.413238420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.882694 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.886365 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.898593 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.958474 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7gh\" (UniqueName: \"kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh\") pod \"ovnkube-node-2m855\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:00 crc kubenswrapper[4806]: I0217 15:21:00.996804 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:00Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.033776 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.033774 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.073743 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.116484 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 14:13:32.983831105 +0000 UTC Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.116734 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.157442 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.160250 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.160285 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.160324 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.160390 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.160493 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.160575 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.169935 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.170513 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.171994 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.172895 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.173954 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.177327 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.178622 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.179440 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.180749 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.181451 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.182836 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.183768 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.184981 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.185520 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.186438 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.187117 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.189448 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.190982 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.192236 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.193071 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.193195 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-c
ontroller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.193764 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.195056 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.195675 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: 
I0217 15:21:01.196888 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.197392 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.199384 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.200270 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.201278 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.201929 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.202548 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.203366 4806 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.203501 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.205360 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.206278 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.206840 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.208780 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.209930 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.210528 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.211557 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.212377 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.213103 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.214225 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.215599 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.216336 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.217512 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.218340 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.219734 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.220734 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.221908 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.222395 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.222889 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.223880 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.224680 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.225715 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.234196 4806 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.278436 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31ae80ae53885a2dd4a06f1f624ad08f22f031e103a6a25a1c53b1025a776c53\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:52Z\\\",\\\"message\\\":\\\"W0217 15:20:42.348689 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 15:20:42.349051 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771341642 cert, and key in /tmp/serving-cert-1089255461/serving-signer.crt, /tmp/serving-cert-1089255461/serving-signer.key\\\\nI0217 15:20:42.593999 1 observer_polling.go:159] Starting file observer\\\\nW0217 15:20:42.597778 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 15:20:42.597910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:42.600222 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1089255461/tls.crt::/tmp/serving-cert-1089255461/tls.key\\\\\\\"\\\\nF0217 15:20:52.891895 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.323575 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.329728 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lvlwv" event={"ID":"e61ad46f-e059-42a8-a36b-cf791e3bf196","Type":"ContainerStarted","Data":"74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.329774 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lvlwv" event={"ID":"e61ad46f-e059-42a8-a36b-cf791e3bf196","Type":"ContainerStarted","Data":"0cc247f0267a995d00f12d8ea8335aab7bfc90d1517b8317ef6fea8de3463f7d"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.330900 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" exitCode=0 Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 
15:21:01.330943 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.330992 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"33663b71273cc60bd5ccd7cadbcb1760143995fe20b7c9bbe4419013de836616"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.332754 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.332791 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.332810 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"28d5ada3a31363bcec033f5991fa77aa2cd01cbca220d47bffb88bb5d10c8b98"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.334182 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.336433 4806 scope.go:117] "RemoveContainer" 
containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.336593 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.337589 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tjnkx" event={"ID":"aff0cd70-eca5-4222-85b8-dd4543122e01","Type":"ContainerStarted","Data":"aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245"} Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.354283 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.410587 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.449926 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.477521 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.556698 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.571176 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.603526 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.635645 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.675425 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.717932 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.754904 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.795735 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.840512 4806 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.877045 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.882750 4806 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.883044 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy podName:344f8a87-e00f-4f0a-a0bc-aee197271160 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.383022021 +0000 UTC m=+23.913652442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy") pod "multus-wgg2s" (UID: "344f8a87-e00f-4f0a-a0bc-aee197271160") : failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.882749 4806 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.882770 4806 configmap.go:193] Couldn't get configMap openshift-multus/cni-copy-resources: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.883232 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist podName:4a981fc6-90ce-4056-b041-a0089f3b40f8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.383220066 +0000 UTC m=+23.913850477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-r9b8d" (UID: "4a981fc6-90ce-4056-b041-a0089f3b40f8") : failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.883275 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy podName:4a981fc6-90ce-4056-b041-a0089f3b40f8 nodeName:}" failed. 
No retries permitted until 2026-02-17 15:21:02.383265707 +0000 UTC m=+23.913896108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-binary-copy" (UniqueName: "kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy") pod "multus-additional-cni-plugins-r9b8d" (UID: "4a981fc6-90ce-4056-b041-a0089f3b40f8") : failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.884784 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.906258 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.918959 4806 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.919305 4806 projected.go:194] Error preparing data for projected volume kube-api-access-x24cx for pod openshift-multus/multus-additional-cni-plugins-r9b8d: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.919485 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx podName:4a981fc6-90ce-4056-b041-a0089f3b40f8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.419462387 +0000 UTC m=+23.950092798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24cx" (UniqueName: "kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx") pod "multus-additional-cni-plugins-r9b8d" (UID: "4a981fc6-90ce-4056-b041-a0089f3b40f8") : failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.936129 4806 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.936435 4806 projected.go:194] Error preparing data for projected volume kube-api-access-z5kbf for pod openshift-multus/multus-wgg2s: failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: E0217 15:21:01.936573 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf podName:344f8a87-e00f-4f0a-a0bc-aee197271160 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:02.436552892 +0000 UTC m=+23.967183303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z5kbf" (UniqueName: "kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf") pod "multus-wgg2s" (UID: "344f8a87-e00f-4f0a-a0bc-aee197271160") : failed to sync configmap cache: timed out waiting for the condition Feb 17 15:21:01 crc kubenswrapper[4806]: I0217 15:21:01.955036 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:01Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.002461 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.034828 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.044829 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.094901 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.104535 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.117383 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 00:15:57.178762617 +0000 UTC Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.154709 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.191188 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.204939 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.342423 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346568 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" 
event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346598 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346608 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346618 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346626 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.346634 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.361337 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.374981 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.387270 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.395554 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.395600 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.395627 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 
15:21:02.397024 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.397072 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4a981fc6-90ce-4056-b041-a0089f3b40f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.397181 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/344f8a87-e00f-4f0a-a0bc-aee197271160-cni-binary-copy\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.400354 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.411577 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.456627 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.493788 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.496270 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5kbf\" (UniqueName: \"kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.497367 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x24cx\" (UniqueName: \"kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.503266 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5kbf\" (UniqueName: \"kubernetes.io/projected/344f8a87-e00f-4f0a-a0bc-aee197271160-kube-api-access-z5kbf\") pod \"multus-wgg2s\" (UID: \"344f8a87-e00f-4f0a-a0bc-aee197271160\") " pod="openshift-multus/multus-wgg2s" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.503425 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24cx\" (UniqueName: 
\"kubernetes.io/projected/4a981fc6-90ce-4056-b041-a0089f3b40f8-kube-api-access-x24cx\") pod \"multus-additional-cni-plugins-r9b8d\" (UID: \"4a981fc6-90ce-4056-b041-a0089f3b40f8\") " pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.522481 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.528873 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wgg2s" Feb 17 15:21:02 crc kubenswrapper[4806]: W0217 15:21:02.544904 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a981fc6_90ce_4056_b041_a0089f3b40f8.slice/crio-c494e2c9d7a81fd6f9abb219de590ac7a558f2d423bfc6e054c3ddc93cb965fb WatchSource:0}: Error finding container c494e2c9d7a81fd6f9abb219de590ac7a558f2d423bfc6e054c3ddc93cb965fb: Status 404 returned error can't find the container with id c494e2c9d7a81fd6f9abb219de590ac7a558f2d423bfc6e054c3ddc93cb965fb Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.545930 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.579343 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.613178 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.651463 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.818897 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.819129 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:21:06.819112815 +0000 UTC m=+28.349743226 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.819182 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.819670 4806 scope.go:117] "RemoveContainer" containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.819795 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.863203 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.886087 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.918918 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.919562 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.919592 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.919623 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:02 crc 
kubenswrapper[4806]: I0217 15:21:02.919640 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919702 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919703 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919724 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919758 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919759 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919782 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919795 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919766 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:06.919753276 +0000 UTC m=+28.450383687 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919831 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919849 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:06.919833248 +0000 UTC m=+28.450463659 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919969 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:06.919938451 +0000 UTC m=+28.450568872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:02 crc kubenswrapper[4806]: E0217 15:21:02.919996 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:06.919985852 +0000 UTC m=+28.450616273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:02 crc kubenswrapper[4806]: I0217 15:21:02.932783 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:02Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.118183 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:15:36.323231769 +0000 UTC Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.160676 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.160682 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:03 crc kubenswrapper[4806]: E0217 15:21:03.160826 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.160710 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:03 crc kubenswrapper[4806]: E0217 15:21:03.160907 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:03 crc kubenswrapper[4806]: E0217 15:21:03.161015 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.351753 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerStarted","Data":"1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc"} Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.352326 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerStarted","Data":"13da8efffeae9ce815b6f847c5685db73b858e215e76bcd81ed90f6887dd88b4"} Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.353878 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61" exitCode=0 Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.354012 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61"} Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.354117 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerStarted","Data":"c494e2c9d7a81fd6f9abb219de590ac7a558f2d423bfc6e054c3ddc93cb965fb"} Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.373909 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.396967 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.421238 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.441662 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.459431 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.475924 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.493726 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.512004 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.524502 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.539700 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.555263 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.570281 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.583546 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.607110 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.621056 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.635771 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.648073 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.662769 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.679345 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.695764 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.710701 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.728190 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.751667 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.773722 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.817752 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.853219 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.893014 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.936777 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:03 crc kubenswrapper[4806]: I0217 15:21:03.976241 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:03Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.019160 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.119415 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:42:27.754532913 +0000 UTC Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.358938 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8" exitCode=0 Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.359016 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8"} Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.366249 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.378550 4806 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.400165 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.411356 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.422236 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.459728 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.512056 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.536533 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.551285 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.565835 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.586066 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.598582 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.609974 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.622561 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.637876 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.651464 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.927458 4806 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.929772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.930108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.930125 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.930254 4806 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.939112 4806 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.939419 4806 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.940988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.941029 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.941039 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.941054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.941065 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:04Z","lastTransitionTime":"2026-02-17T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:04 crc kubenswrapper[4806]: E0217 15:21:04.954109 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.957885 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.957924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.957936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.957953 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.957963 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:04Z","lastTransitionTime":"2026-02-17T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:04 crc kubenswrapper[4806]: E0217 15:21:04.970562 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.973605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.973632 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.973639 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.973656 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.973666 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:04Z","lastTransitionTime":"2026-02-17T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:04 crc kubenswrapper[4806]: E0217 15:21:04.986695 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:04Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.990310 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.990334 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.990344 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.990358 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:04 crc kubenswrapper[4806]: I0217 15:21:04.990366 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:04Z","lastTransitionTime":"2026-02-17T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.002717 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.006557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.006579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.006588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.006604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.006613 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.019349 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.019508 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.020826 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.020873 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.020885 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.020899 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.020908 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.119883 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 02:49:49.149264649 +0000 UTC Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.123683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.123720 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.123738 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.123760 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.123774 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.160134 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.160213 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.160213 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.160280 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.160430 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:05 crc kubenswrapper[4806]: E0217 15:21:05.160619 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.226347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.226383 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.226393 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.226427 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.226440 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.329632 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.329676 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.329686 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.329701 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.329711 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.373747 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838" exitCode=0 Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.373812 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.393965 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.411458 4806 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.423250 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.432128 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.432187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.432206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.432233 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.432257 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.454541 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.474865 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.490606 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.509907 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.536002 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.536035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.536045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc 
kubenswrapper[4806]: I0217 15:21:05.536061 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.536072 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.541653 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.559734 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960c
be355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.583369 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.600360 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.613734 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.627377 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.639218 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.639268 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.639276 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.639293 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.639304 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.643576 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.655446 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:05Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.741964 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.742031 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.742050 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.742076 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.742096 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.845461 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.845545 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.845567 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.845592 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.845609 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.949167 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.949246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.949291 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.949319 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:05 crc kubenswrapper[4806]: I0217 15:21:05.949337 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:05Z","lastTransitionTime":"2026-02-17T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.052846 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.052917 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.052936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.052965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.052986 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.120474 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 17:57:45.27415096 +0000 UTC Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.155922 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.155977 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.155994 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.156023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.156041 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.259373 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.259461 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.259480 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.259507 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.259525 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.363984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.364033 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.364045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.364072 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.364086 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.385738 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff" exitCode=0 Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.385790 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.398145 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.399371 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.399464 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.403265 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.426076 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.437579 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.439021 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.448262 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.461967 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.466866 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.467012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.467035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.467054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.467080 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.483671 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.494513 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.503769 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.527986 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.548002 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.559364 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.570232 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.570443 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.570463 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.570471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc 
kubenswrapper[4806]: I0217 15:21:06.570486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.570496 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.582970 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.586310 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.598671 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\
\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.615011 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.629362 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.644162 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.657437 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.673981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.674027 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.674041 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc 
kubenswrapper[4806]: I0217 15:21:06.674061 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.674070 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.675781 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.689776 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.705753 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.715990 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.730541 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.744192 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.755767 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.768349 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.776191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.776234 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.776245 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.776264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.776275 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.780610 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.791260 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.801748 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.829710 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.841134 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.856681 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.856811 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:21:14.856784539 +0000 UTC m=+36.387414950 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.878578 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.878627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.878640 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.878658 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.878671 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958097 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958134 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958149 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.957972 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.958222 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.958274 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.958309 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958486 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958503 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958515 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958893 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.958987 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:06 crc 
kubenswrapper[4806]: E0217 15:21:06.959035 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:14.958555678 +0000 UTC m=+36.489186099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.959053 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:14.959044541 +0000 UTC m=+36.489674972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.959072 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:14.959059611 +0000 UTC m=+36.489690032 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:06 crc kubenswrapper[4806]: E0217 15:21:06.959085 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:14.959078911 +0000 UTC m=+36.489709332 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.981107 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.981158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.981170 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.981191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:06 crc kubenswrapper[4806]: I0217 15:21:06.981205 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:06Z","lastTransitionTime":"2026-02-17T15:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.083813 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.083845 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.083856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.083872 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.083885 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.120836 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:11:54.33753797 +0000 UTC Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.160528 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.160598 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:07 crc kubenswrapper[4806]: E0217 15:21:07.160661 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:07 crc kubenswrapper[4806]: E0217 15:21:07.160782 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.160875 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:07 crc kubenswrapper[4806]: E0217 15:21:07.161040 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.187093 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.187157 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.187170 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.187193 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.187206 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.289940 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.290023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.290045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.290075 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.290095 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.392845 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.392947 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.392971 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.393008 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.393036 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.406763 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366" exitCode=0 Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.406841 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.420771 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.443190 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.455092 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.473770 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.488961 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.495471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.495512 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.495525 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.495542 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.495551 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.505489 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z 
is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.520641 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f20
2b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.537819 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.548786 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.563700 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.587699 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.605775 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.606830 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.606911 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.606926 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.606953 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.606968 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.629482 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.641724 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.657515 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:07Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.709264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.709298 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.709309 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.709326 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.709336 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.812665 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.812976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.813160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.813294 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.813481 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.916745 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.916792 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.916800 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.916817 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:07 crc kubenswrapper[4806]: I0217 15:21:07.916826 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:07Z","lastTransitionTime":"2026-02-17T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.020339 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.020439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.020458 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.020484 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.020502 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.121187 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 14:29:24.82117526 +0000 UTC Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.127849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.127923 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.127944 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.127978 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.128000 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.230764 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.230819 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.230835 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.230856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.230872 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.334368 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.334481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.334504 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.334534 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.334555 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.421653 4806 generic.go:334] "Generic (PLEG): container finished" podID="4a981fc6-90ce-4056-b041-a0089f3b40f8" containerID="bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6" exitCode=0 Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.421735 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerDied","Data":"bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.437811 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.437860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.437876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.437897 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.437912 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.443037 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.454781 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a
87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.480329 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.495432 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.512177 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.531322 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.541977 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.542030 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc 
kubenswrapper[4806]: I0217 15:21:08.542045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.542067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.542081 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.549419 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.574504 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.600840 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.618683 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652
d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\
":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.633715 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z"
Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.643930 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.643970 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.643981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.643998 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.644007 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.652139 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.665745 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff
223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.679101 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.690110 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:08Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.745849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.745908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.745924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.745950 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.745967 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.848085 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.848150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.848168 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.848195 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.848213 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.951868 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.951928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.951949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.951973 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.951992 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:08Z","lastTransitionTime":"2026-02-17T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:08 crc kubenswrapper[4806]: I0217 15:21:08.966890 4806 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.056028 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.056100 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.056119 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.056145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.056162 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.122031 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 10:47:43.821304454 +0000 UTC Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159302 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159348 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159360 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159377 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159390 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.159957 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.160053 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.160245 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:09 crc kubenswrapper[4806]: E0217 15:21:09.160246 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:09 crc kubenswrapper[4806]: E0217 15:21:09.160336 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:09 crc kubenswrapper[4806]: E0217 15:21:09.160498 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.177531 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.190008 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.210244 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.229523 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.245564 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.262898 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.262944 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.262960 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.262981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.262995 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.265531 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.284569 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.299462 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.316388 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.333572 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.346157 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.366065 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.366164 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.366184 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.366218 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.366237 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.370668 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.384348 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.399636 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.413735 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.428051 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/0.log" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.433492 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35" exitCode=1 Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.433573 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.434465 4806 scope.go:117] "RemoveContainer" containerID="39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.438882 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" event={"ID":"4a981fc6-90ce-4056-b041-a0089f3b40f8","Type":"ContainerStarted","Data":"925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.451883 4806 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.468967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.469253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.469368 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.469522 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.469636 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.469078 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.484287 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.499674 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.513295 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.525617 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.538472 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.555941 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67
314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.568084 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.571857 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.571894 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.571903 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.571917 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.571925 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.581747 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.591037 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.600464 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.611126 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.628230 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.639278 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.653249 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.662158 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.674203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.674230 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.674240 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc 
kubenswrapper[4806]: I0217 15:21:09.674255 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.674266 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.686812 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.697170 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.710521 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.725572 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.738539 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.752061 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.769152 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.777562 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.777605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.777622 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.777696 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.777716 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.787645 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.815289 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.828890 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.840522 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.850200 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.862245 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.879975 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.880262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.880320 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.880379 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.880463 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.982945 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.982987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.982998 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.983016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:09 crc kubenswrapper[4806]: I0217 15:21:09.983029 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:09Z","lastTransitionTime":"2026-02-17T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.086327 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.086448 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.086464 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.086482 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.086493 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.122202 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 02:25:56.200669194 +0000 UTC Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.189648 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.189702 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.189715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.189731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.189748 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.292675 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.292743 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.292762 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.292791 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.292809 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.395885 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.395947 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.395967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.395992 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.396010 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.444481 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/1.log" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.445457 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/0.log" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.448548 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36" exitCode=1 Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.448601 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.448643 4806 scope.go:117] "RemoveContainer" containerID="39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.449534 4806 scope.go:117] "RemoveContainer" containerID="6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36" Feb 17 15:21:10 crc kubenswrapper[4806]: E0217 15:21:10.449731 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.467182 4806 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.499264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.499329 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.499348 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.499374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.499393 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.505017 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.522280 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.535875 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.547174 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.560681 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.577307 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.602045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.602106 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.602127 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc 
kubenswrapper[4806]: I0217 15:21:10.602154 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.602171 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.607455 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.629453 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.652919 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.680515 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.699187 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.704111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.704158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.704170 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.704189 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.704201 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.717942 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.733327 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.747019 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:10Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.807165 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.807221 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.807238 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.807263 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.807280 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.910208 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.910306 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.910325 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.910353 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:10 crc kubenswrapper[4806]: I0217 15:21:10.910372 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:10Z","lastTransitionTime":"2026-02-17T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.014197 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.014258 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.014278 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.014316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.014335 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.117702 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.117766 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.117784 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.117809 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.117826 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.123056 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:47:07.22445166 +0000 UTC Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.160836 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.160963 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.160993 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:11 crc kubenswrapper[4806]: E0217 15:21:11.161176 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:11 crc kubenswrapper[4806]: E0217 15:21:11.161316 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:11 crc kubenswrapper[4806]: E0217 15:21:11.161565 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.221465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.221516 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.221535 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.221557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.221574 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.324868 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.324932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.324952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.324976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.324995 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.427665 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.427717 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.427729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.427747 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.427762 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.454648 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/1.log" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.530346 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.530439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.530461 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.530489 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.530519 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.633899 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.633970 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.633991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.634017 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.634035 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.737339 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.737439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.737459 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.737485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.737502 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.840803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.840867 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.840885 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.840912 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.840931 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.944949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.945025 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.945052 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.945083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:11 crc kubenswrapper[4806]: I0217 15:21:11.945107 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:11Z","lastTransitionTime":"2026-02-17T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.047984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.048106 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.048131 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.048160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.048182 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.124154 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:20:00.662385365 +0000 UTC Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.151210 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.151265 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.151282 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.151305 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.151324 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.254392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.254531 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.254556 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.254588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.254614 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.358096 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.358662 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.358695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.358727 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.358747 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.462252 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.462396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.462500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.462586 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.462615 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.566122 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.566185 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.566203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.566226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.566244 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.669255 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.669333 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.669350 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.669378 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.669396 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.772332 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.772377 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.772433 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.772451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.772494 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.876110 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.876190 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.876208 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.876234 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.876252 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.979627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.979716 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.979742 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.979775 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:12 crc kubenswrapper[4806]: I0217 15:21:12.979804 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:12Z","lastTransitionTime":"2026-02-17T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.082854 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.082911 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.082928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.082949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.082966 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.125166 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 05:46:57.192191121 +0000 UTC Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.160043 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.160097 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:13 crc kubenswrapper[4806]: E0217 15:21:13.160228 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.160270 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:13 crc kubenswrapper[4806]: E0217 15:21:13.160380 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:13 crc kubenswrapper[4806]: E0217 15:21:13.160542 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.177641 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j"] Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.178476 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.182839 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.183056 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.185342 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.185448 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.185478 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.185510 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.185562 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.199892 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.221501 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.239358 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.239452 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f73041c-6d45-4e20-b119-00a5feae4d58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.239517 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jnpn\" (UniqueName: \"kubernetes.io/projected/0f73041c-6d45-4e20-b119-00a5feae4d58-kube-api-access-4jnpn\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 
15:21:13.239659 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.242110 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.262284 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.282937 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.289036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.289095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.289117 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.289145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.289162 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.299312 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.314619 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.340859 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jnpn\" (UniqueName: \"kubernetes.io/projected/0f73041c-6d45-4e20-b119-00a5feae4d58-kube-api-access-4jnpn\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.340936 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.341047 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f73041c-6d45-4e20-b119-00a5feae4d58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.341085 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.342186 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.342364 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f73041c-6d45-4e20-b119-00a5feae4d58-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.347790 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.351113 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f73041c-6d45-4e20-b119-00a5feae4d58-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.366311 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.371731 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jnpn\" (UniqueName: \"kubernetes.io/projected/0f73041c-6d45-4e20-b119-00a5feae4d58-kube-api-access-4jnpn\") pod \"ovnkube-control-plane-749d76644c-jmb6j\" (UID: \"0f73041c-6d45-4e20-b119-00a5feae4d58\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.395947 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.395998 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.396012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.396035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.396049 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.397108 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.424480 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.455325 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.473629 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70
e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.485151 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.499369 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.499433 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.499446 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.499465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.499478 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.500019 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.500250 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: W0217 15:21:13.513925 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f73041c_6d45_4e20_b119_00a5feae4d58.slice/crio-c10053a59b7a5d8fb4a52da7b41f044297a66eafea3d75e85f6da411abc0ba29 WatchSource:0}: Error finding container c10053a59b7a5d8fb4a52da7b41f044297a66eafea3d75e85f6da411abc0ba29: Status 404 returned error can't find the container with id c10053a59b7a5d8fb4a52da7b41f044297a66eafea3d75e85f6da411abc0ba29 Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.517055 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:13Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.602855 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.602921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.602935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.602954 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.602970 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.706152 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.706187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.706197 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.706212 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.706221 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.809749 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.809802 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.809814 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.809836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.809850 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.912514 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.912556 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.912567 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.912585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:13 crc kubenswrapper[4806]: I0217 15:21:13.912597 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:13Z","lastTransitionTime":"2026-02-17T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.015848 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.015937 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.015963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.015995 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.016019 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.118951 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.119028 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.119124 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.119216 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.119302 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.125538 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 22:07:46.102359865 +0000 UTC Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.227701 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.227758 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.227768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.227787 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.227802 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.331011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.331087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.331134 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.331169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.331193 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.434584 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.434655 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.434675 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.434705 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.434723 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.473150 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" event={"ID":"0f73041c-6d45-4e20-b119-00a5feae4d58","Type":"ContainerStarted","Data":"b0733cc964de8f3d3cc6c7a01326c84239466338592237d2eb3c0b8813db6952"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.473210 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" event={"ID":"0f73041c-6d45-4e20-b119-00a5feae4d58","Type":"ContainerStarted","Data":"e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.473228 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" event={"ID":"0f73041c-6d45-4e20-b119-00a5feae4d58","Type":"ContainerStarted","Data":"c10053a59b7a5d8fb4a52da7b41f044297a66eafea3d75e85f6da411abc0ba29"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.507692 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.524317 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.537851 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.537894 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.537907 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.537927 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.537939 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.542262 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.558064 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.574743 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.592377 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.607278 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.634085 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.640604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.640670 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.640698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.640731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.640755 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.656729 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.678908 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.705739 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.727443 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.738530 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-h72qm"] Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.739364 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.739500 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.744009 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.744074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.744094 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.744121 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.744140 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.749518 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.769463 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.786971 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.802538 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.818024 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.840914 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.847131 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.847180 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.847198 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.847222 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.847240 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.859716 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.859855 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 15:21:30.859826108 +0000 UTC m=+52.390456559 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.860007 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.860053 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj58x\" (UniqueName: \"kubernetes.io/projected/5af69f46-757a-4fab-adbd-d7a278868c94-kube-api-access-rj58x\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.864255 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70
e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.882782 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.898630 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.912139 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.926165 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.939720 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.950232 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.950292 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.950316 4806 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.950347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.950401 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:14Z","lastTransitionTime":"2026-02-17T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.955204 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 
17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961505 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961584 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961662 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961717 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj58x\" (UniqueName: \"kubernetes.io/projected/5af69f46-757a-4fab-adbd-d7a278868c94-kube-api-access-rj58x\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961777 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961806 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961841 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961861 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961888 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961807 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961931 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:30.961903528 +0000 UTC m=+52.492533979 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961965 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961889 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.961962 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.961836 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.962054 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 
15:21:14.962035 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:30.96200722 +0000 UTC m=+52.492637671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.962258 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:30.962237006 +0000 UTC m=+52.492867417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.962274 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:15.462266877 +0000 UTC m=+36.992897288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:14 crc kubenswrapper[4806]: E0217 15:21:14.962286 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:30.962280647 +0000 UTC m=+52.492911058 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.969372 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.985197 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj58x\" (UniqueName: \"kubernetes.io/projected/5af69f46-757a-4fab-adbd-d7a278868c94-kube-api-access-rj58x\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:14 crc kubenswrapper[4806]: I0217 15:21:14.999651 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:14Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.011650 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.023164 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.032767 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.045279 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.053264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.053300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.053311 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.053332 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.053344 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.060388 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.085937 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.126165 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:56:20.879682314 +0000 UTC Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.141772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.141831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.142036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.142057 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.142071 4806 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.156709 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.160041 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.160207 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.160593 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.160699 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.160595 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.160806 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.163293 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.163361 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.163388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.163443 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.163466 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.182999 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.187822 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.187893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.187913 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.187941 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.187960 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.208233 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.212376 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.212463 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.212481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.212505 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.212523 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.231143 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.235372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.235467 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.235486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.235511 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.235529 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.254878 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:15Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.254998 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.257361 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.257452 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.257479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.257508 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.257533 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.360072 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.360135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.360151 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.360176 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.360197 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.463336 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.463396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.463451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.463476 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.463493 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.466996 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.467242 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:15 crc kubenswrapper[4806]: E0217 15:21:15.467346 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:16.46731805 +0000 UTC m=+37.997948501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.566550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.566639 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.566654 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.566677 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.566694 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.670325 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.670398 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.670456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.670496 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.670522 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.773489 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.773598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.773614 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.773636 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.773653 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.877225 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.877294 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.877312 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.877341 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.877361 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.980484 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.980583 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.980602 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.980626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:15 crc kubenswrapper[4806]: I0217 15:21:15.980644 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:15Z","lastTransitionTime":"2026-02-17T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.089779 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.089831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.089844 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.089864 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.089878 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.126465 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 18:03:25.629846439 +0000 UTC Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.160211 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:16 crc kubenswrapper[4806]: E0217 15:21:16.160483 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.161376 4806 scope.go:117] "RemoveContainer" containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.193279 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.193335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.193352 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.193378 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.193395 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.295807 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.295871 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.295888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.295913 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.295930 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.399079 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.399146 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.399169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.399201 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.399264 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.478742 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:16 crc kubenswrapper[4806]: E0217 15:21:16.479019 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:16 crc kubenswrapper[4806]: E0217 15:21:16.479173 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:18.479129083 +0000 UTC m=+40.009759544 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.490624 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.493013 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.493958 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.501391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.501450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.501463 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.501481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.501493 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.511748 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.532121 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.547122 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.561025 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.573242 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.583931 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.596680 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc 
kubenswrapper[4806]: I0217 15:21:16.603654 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.603693 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.603709 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.603731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.603747 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.612089 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.625339 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.640864 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.658162 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.669754 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.679341 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.701513 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.706394 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.706498 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.706526 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.706558 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.706598 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.721654 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.735850 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.750858 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.810150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.810214 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.810230 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc 
kubenswrapper[4806]: I0217 15:21:16.810253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.810273 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.912966 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.913057 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.913097 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.913135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:16 crc kubenswrapper[4806]: I0217 15:21:16.913158 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:16Z","lastTransitionTime":"2026-02-17T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.016785 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.016853 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.016871 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.016901 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.016925 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.118937 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.118997 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.119015 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.119043 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.119066 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.127638 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 10:30:58.184135011 +0000 UTC Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.161055 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.161173 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.161090 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:17 crc kubenswrapper[4806]: E0217 15:21:17.161224 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:17 crc kubenswrapper[4806]: E0217 15:21:17.161447 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:17 crc kubenswrapper[4806]: E0217 15:21:17.161575 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.222727 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.222803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.222819 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.222843 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.222861 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.326330 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.326390 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.326440 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.326471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.326490 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.432182 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.432268 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.432295 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.432345 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.432371 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.535321 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.535372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.535384 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.535439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.535456 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.638254 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.638311 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.638327 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.638351 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.638368 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.741614 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.741683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.741700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.741725 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.741745 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.845205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.845269 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.845278 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.845295 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.845305 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.948486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.948588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.948609 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.948637 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:17 crc kubenswrapper[4806]: I0217 15:21:17.948654 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:17Z","lastTransitionTime":"2026-02-17T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.051868 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.051961 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.051980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.052008 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.052027 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.128794 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 12:56:59.619655489 +0000 UTC Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.155341 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.155488 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.155508 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.155532 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.155551 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.160713 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:18 crc kubenswrapper[4806]: E0217 15:21:18.160920 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.257923 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.258010 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.258029 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.258054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.258073 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.361835 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.361900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.361919 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.361945 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.361962 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.465398 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.465510 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.465530 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.465560 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.465579 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.505198 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:18 crc kubenswrapper[4806]: E0217 15:21:18.505442 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:18 crc kubenswrapper[4806]: E0217 15:21:18.505562 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:22.505531367 +0000 UTC m=+44.036161818 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.569349 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.569467 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.569488 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.569517 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.569537 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.672967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.673074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.673095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.673123 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.673146 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.775899 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.776016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.776034 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.776058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.776075 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.879490 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.879571 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.879613 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.879640 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.879656 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.983133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.983186 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.983203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.983227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:18 crc kubenswrapper[4806]: I0217 15:21:18.983243 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:18Z","lastTransitionTime":"2026-02-17T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.086130 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.086215 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.086287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.086319 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.086340 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.129880 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 10:25:23.836810406 +0000 UTC Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.160751 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.160883 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.160751 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:19 crc kubenswrapper[4806]: E0217 15:21:19.160961 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:19 crc kubenswrapper[4806]: E0217 15:21:19.161096 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:19 crc kubenswrapper[4806]: E0217 15:21:19.161207 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.181312 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.190380 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.190518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.190550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.190590 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.191820 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.201459 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.232008 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39d0c058f021e2c4744ba355d5d5d3f6b1b57194a1e3a44d01a919f1fabf5e35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"message\\\":\\\"ctory.go:160\\\\nI0217 15:21:09.230234 6036 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230355 6036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230514 6036 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.230704 6036 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 15:21:09.230742 6036 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 15:21:09.230777 6036 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 15:21:09.230147 6036 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.230908 6036 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 15:21:09.231204 6036 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 15:21:09.231281 6036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 15:21:09.231318 6036 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.248243 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.270972 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.293031 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.295672 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.295754 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.295774 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.295803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.295822 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.309250 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z 
is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.326881 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.345959 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.369787 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.389905 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.397872 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.397910 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.397922 4806 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.397938 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.397953 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.405287 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 
17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.416515 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.438485 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/
etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.454577 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.466433 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.478184 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:19Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.500483 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.500536 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.500545 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.500561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.500575 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.604018 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.604101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.604121 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.604553 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.604651 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.707616 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.707669 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.707680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.707698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.707713 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.811614 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.811695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.811715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.811746 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.811769 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.915176 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.915238 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.915255 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.915280 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:19 crc kubenswrapper[4806]: I0217 15:21:19.915297 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:19Z","lastTransitionTime":"2026-02-17T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.018387 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.018570 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.018610 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.018642 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.018666 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.121485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.121579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.121598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.121624 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.121641 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.130333 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:59:03.325351035 +0000 UTC Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.160178 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:20 crc kubenswrapper[4806]: E0217 15:21:20.160386 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.225040 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.225095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.225111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.225133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.225151 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.327611 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.327705 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.327724 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.327750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.327766 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.430442 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.430528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.430546 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.430573 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.430590 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.533835 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.533898 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.533915 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.533940 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.533956 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.636829 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.636895 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.636952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.636978 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.636997 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.740528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.740589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.740616 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.740645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.740669 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.843904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.843950 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.843968 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.844003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.844039 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.947359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.947456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.947474 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.947501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:20 crc kubenswrapper[4806]: I0217 15:21:20.947540 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:20Z","lastTransitionTime":"2026-02-17T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.051032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.051115 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.051133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.051159 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.051177 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.131519 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 19:31:09.551975446 +0000 UTC Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.155505 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.155617 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.155640 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.155667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.155685 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.160940 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.160996 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.160959 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:21 crc kubenswrapper[4806]: E0217 15:21:21.161161 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:21 crc kubenswrapper[4806]: E0217 15:21:21.161335 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:21 crc kubenswrapper[4806]: E0217 15:21:21.161515 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.259621 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.259687 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.259706 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.259732 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.259750 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.364216 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.364545 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.364704 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.364843 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.364950 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.468820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.468893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.468917 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.468947 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.468979 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.572046 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.572112 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.572135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.572165 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.572185 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.675799 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.675871 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.675894 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.675926 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.675951 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.778980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.779028 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.779040 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.779060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.779072 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.882258 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.882342 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.882370 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.882446 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.882477 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.986395 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.986493 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.986517 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.986577 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:21 crc kubenswrapper[4806]: I0217 15:21:21.986596 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:21Z","lastTransitionTime":"2026-02-17T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.090626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.090704 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.090729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.090758 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.090781 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.132204 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 10:02:20.789411551 +0000 UTC Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.160829 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:22 crc kubenswrapper[4806]: E0217 15:21:22.161019 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.194472 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.194568 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.194592 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.194627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.194644 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.297216 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.297289 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.297314 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.297343 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.297360 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.400967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.401053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.401073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.401097 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.401113 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.503870 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.503926 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.503944 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.503967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.503984 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.554272 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:22 crc kubenswrapper[4806]: E0217 15:21:22.554484 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:22 crc kubenswrapper[4806]: E0217 15:21:22.554563 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:30.554540947 +0000 UTC m=+52.085171388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.607086 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.607152 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.607172 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.607202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.607222 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.710148 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.710211 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.710228 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.710254 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.710285 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.813740 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.813795 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.813813 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.813837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.813855 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.916922 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.916986 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.917026 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.917045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:22 crc kubenswrapper[4806]: I0217 15:21:22.917058 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:22Z","lastTransitionTime":"2026-02-17T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.019626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.019690 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.019718 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.019750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.019770 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.122893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.122983 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.123003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.123032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.123054 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.132692 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:56:10.097737681 +0000 UTC Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.160215 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.160255 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.160448 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:23 crc kubenswrapper[4806]: E0217 15:21:23.160636 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:23 crc kubenswrapper[4806]: E0217 15:21:23.160764 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:23 crc kubenswrapper[4806]: E0217 15:21:23.160957 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.226348 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.226444 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.226471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.226502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.226526 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.329334 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.329396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.329456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.329488 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.329516 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.432160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.432229 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.432246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.432270 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.432288 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.534983 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.535053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.535080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.535108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.535127 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.638839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.638916 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.638969 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.638995 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.639013 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.741914 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.741985 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.742007 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.742036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.742288 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.845837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.845929 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.845954 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.845988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.846012 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.949738 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.949801 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.949818 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.949844 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:23 crc kubenswrapper[4806]: I0217 15:21:23.949862 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:23Z","lastTransitionTime":"2026-02-17T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.053287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.053348 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.053365 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.053391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.053430 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.133530 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 09:38:16.211831897 +0000 UTC Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.156200 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.156257 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.156274 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.156300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.156319 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.160673 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:24 crc kubenswrapper[4806]: E0217 15:21:24.160871 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.258909 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.258963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.258981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.259006 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.259025 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.363533 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.363604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.363621 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.363645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.363663 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.466867 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.466950 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.466967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.466993 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.467010 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.571074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.571144 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.571161 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.571189 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.571210 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.675014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.675088 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.675107 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.675134 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.675152 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.778262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.778329 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.778347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.778374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.778391 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.881609 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.881666 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.881684 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.881708 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.881725 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.984949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.985016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.985034 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.985059 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:24 crc kubenswrapper[4806]: I0217 15:21:24.985079 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:24Z","lastTransitionTime":"2026-02-17T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.087182 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.087227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.087238 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.087253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.087265 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.134324 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:36:48.385253364 +0000 UTC Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.160456 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.160583 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.160718 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.160861 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.161589 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.161712 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.162056 4806 scope.go:117] "RemoveContainer" containerID="6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.178131 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.189753 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.190269 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.190288 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.190315 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.190332 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.190871 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.205304 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.217757 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc 
kubenswrapper[4806]: I0217 15:21:25.232363 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.252602 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.258781 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.258823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.258840 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.258867 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.258887 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.269331 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.276433 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.280959 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.281010 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.281022 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.281041 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.281053 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.287473 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.294118 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.298713 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.298772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.298788 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.298810 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.298825 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.306048 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.316255 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.319997 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.320237 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.320470 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.320680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.320859 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.322888 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.337654 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.340386 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.344839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.344876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.344887 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.344908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.344921 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.365612 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: E0217 15:21:25.365758 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.366384 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.367761 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.368483 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.368499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.368516 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.368526 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.382816 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.400571 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.418456 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.430719 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.442581 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.470379 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.470435 4806 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.470451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.470471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.470482 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.530278 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/1.log" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.533487 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.534562 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.558935 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.573793 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.573872 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.573894 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.573924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.573946 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.582776 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.605234 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.622325 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.643286 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.665115 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.677287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.677331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.677346 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc 
kubenswrapper[4806]: I0217 15:21:25.677366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.677378 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.700681 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 
services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.713840 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.729140 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.744226 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.755046 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.766289 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.778264 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.779951 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.779993 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.780005 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.780023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.780036 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.791590 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.801591 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.811510 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc 
kubenswrapper[4806]: I0217 15:21:25.825133 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:25Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.883143 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.883191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.883206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.883223 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.883236 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.991362 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.991470 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.991497 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.991532 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:25 crc kubenswrapper[4806]: I0217 15:21:25.991557 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:25Z","lastTransitionTime":"2026-02-17T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.094231 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.095589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.095809 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.096009 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.096221 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.135202 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:44:59.739015679 +0000 UTC Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.160873 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:26 crc kubenswrapper[4806]: E0217 15:21:26.161083 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.199593 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.199633 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.199651 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.199672 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.199691 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.302969 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.303048 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.303066 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.303094 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.303112 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.406083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.406332 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.406350 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.406379 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.406398 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.508893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.508960 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.508979 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.509007 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.509030 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.541007 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/2.log" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.541741 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/1.log" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.544497 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" exitCode=1 Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.544531 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.544564 4806 scope.go:117] "RemoveContainer" containerID="6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.545124 4806 scope.go:117] "RemoveContainer" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" Feb 17 15:21:26 crc kubenswrapper[4806]: E0217 15:21:26.545260 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.568550 4806 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02
-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d
30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.590494 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:
00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611437 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611515 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.611925 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e
568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.635434 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.651752 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.670594 4806 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc 
kubenswrapper[4806]: I0217 15:21:26.691209 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.711994 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.713312 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.713357 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.713373 4806 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.713398 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.713449 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.734543 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.751668 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.765162 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.778026 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.810776 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.821109 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.821156 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.821168 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.821187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.821200 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.828832 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.847752 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.863878 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.895499 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c223c9adce2d31376196ea9ccfe5d2c3176e4d57d2c47b80838a6e8ff2b2b36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:10Z\\\",\\\"message\\\":\\\"ole/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"ab0b1d51-5ec6-479b-8881-93dfa8d30337\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, 
Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-network-console/networking-console-plugin_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.246\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 15:21:10.358963 6216 lb_config.go:1031] Cluster endpoints for openshift-marketplace/community-operators for network=default are: map[]\\\\nI0217 15:21:10.358982 6216 services_co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 
15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0
206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:26Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.923921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.923971 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.923988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.924013 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:26 crc kubenswrapper[4806]: I0217 15:21:26.924031 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:26Z","lastTransitionTime":"2026-02-17T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.026964 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.027058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.027077 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.027101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.027123 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.130242 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.130696 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.130853 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.130983 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.131115 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.135709 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:17:59.663803452 +0000 UTC Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.160755 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.161019 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.160844 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:27 crc kubenswrapper[4806]: E0217 15:21:27.161221 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:27 crc kubenswrapper[4806]: E0217 15:21:27.161377 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:27 crc kubenswrapper[4806]: E0217 15:21:27.161474 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.234069 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.234143 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.234165 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.234192 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.234210 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.336434 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.336485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.336501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.336524 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.336537 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.440436 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.440502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.440521 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.440549 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.440567 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.542974 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.543030 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.543046 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.543067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.543083 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.550374 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/2.log" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.556390 4806 scope.go:117] "RemoveContainer" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" Feb 17 15:21:27 crc kubenswrapper[4806]: E0217 15:21:27.556734 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.576904 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.600134 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.625969 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.646352 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.646444 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.646473 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.646503 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.646525 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.652536 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z 
is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.679981 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.705312 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.731839 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.749029 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.749105 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.749128 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.749156 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.749173 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.753617 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.771180 4806 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc 
kubenswrapper[4806]: I0217 15:21:27.807131 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.829637 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.849230 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.851841 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.851868 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.851880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.851898 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.851910 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.866736 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.884153 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.907059 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.925338 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.954886 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.954963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.954990 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:27 crc 
kubenswrapper[4806]: I0217 15:21:27.955023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.955046 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:27Z","lastTransitionTime":"2026-02-17T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:27 crc kubenswrapper[4806]: I0217 15:21:27.960077 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:27Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.057852 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.057921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.057939 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.057982 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.057999 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.136755 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 02:14:35.920554844 +0000 UTC Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160320 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:28 crc kubenswrapper[4806]: E0217 15:21:28.160596 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160802 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160819 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.160869 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.264560 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.264633 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.264651 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.264675 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.264695 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.367534 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.367577 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.367589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.367608 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.367622 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.471032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.471090 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.471101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.471120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.471132 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.574447 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.574523 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.574542 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.574569 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.574588 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.677120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.677175 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.677191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.677211 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.677226 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.779820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.779893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.779918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.779949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.779972 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.883397 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.883829 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.884103 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.884328 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.884587 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.987915 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.987989 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.988012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.988042 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:28 crc kubenswrapper[4806]: I0217 15:21:28.988063 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:28Z","lastTransitionTime":"2026-02-17T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.091141 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.091202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.091220 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.091264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.091324 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.138217 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 15:45:48.985511178 +0000 UTC Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.161231 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.161316 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.161350 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:29 crc kubenswrapper[4806]: E0217 15:21:29.163189 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:29 crc kubenswrapper[4806]: E0217 15:21:29.165151 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:29 crc kubenswrapper[4806]: E0217 15:21:29.165400 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.184792 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs
.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.194329 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.194397 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.194454 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.194482 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.194500 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.203718 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.227438 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b38
1971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.246213 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.278369 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.297869 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.298156 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.298180 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.298191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.298208 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.298219 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.315697 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.344799 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.363886 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.380988 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.397303 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.401928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.401962 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.401973 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.401991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.402003 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.424983 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.441540 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc 
kubenswrapper[4806]: I0217 15:21:29.463071 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.478841 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.498218 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.504646 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.504722 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.504741 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.504766 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.504783 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.514512 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:29Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.607433 4806 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.607803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.607915 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.608014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.608131 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.711876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.711936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.711957 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.711982 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.712000 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.814944 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.815006 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.815023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.815048 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.815065 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.917551 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.917620 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.917639 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.917689 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:29 crc kubenswrapper[4806]: I0217 15:21:29.917707 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:29Z","lastTransitionTime":"2026-02-17T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.021045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.021112 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.021133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.021160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.021179 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.124940 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.125265 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.130004 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.130865 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.130934 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.139728 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:28:08.970492137 +0000 UTC Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.159968 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:30 crc kubenswrapper[4806]: E0217 15:21:30.160130 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.234670 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.234748 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.234772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.234800 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.234820 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.337994 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.338043 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.338061 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.338085 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.338102 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.441236 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.441293 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.441310 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.441333 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.441351 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.544439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.544513 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.544537 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.544572 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.544593 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.642897 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:30 crc kubenswrapper[4806]: E0217 15:21:30.643290 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:30 crc kubenswrapper[4806]: E0217 15:21:30.643453 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:21:46.643395262 +0000 UTC m=+68.174025713 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.648568 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.648625 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.648648 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.648681 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.648707 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.751707 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.751779 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.751803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.751838 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.751862 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.855192 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.855268 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.855288 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.855716 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.855766 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.947523 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:21:30 crc kubenswrapper[4806]: E0217 15:21:30.947723 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 15:22:02.947679882 +0000 UTC m=+84.478310333 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.958814 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.958901 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.958924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.958955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:30 crc kubenswrapper[4806]: I0217 15:21:30.958980 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:30Z","lastTransitionTime":"2026-02-17T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.049270 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.049321 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.049350 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.049382 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.049548 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.049570 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.049582 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.049658 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:22:03.049616728 +0000 UTC m=+84.580247139 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.049977 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050076 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050038 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050137 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050149 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050260 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:22:03.05009133 +0000 UTC m=+84.580721741 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050340 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:22:03.050330986 +0000 UTC m=+84.580961397 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.050421 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:22:03.050413348 +0000 UTC m=+84.581043759 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.060924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.060976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.060992 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.061012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.061024 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.140644 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 07:14:08.075167255 +0000 UTC Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.160259 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.160271 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.160279 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.160518 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.160608 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:31 crc kubenswrapper[4806]: E0217 15:21:31.160716 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.164201 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.164236 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.164246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.164264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.164281 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.267174 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.267239 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.267257 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.267287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.267306 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.370232 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.370309 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.370335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.370372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.370398 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.379504 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.391698 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.396805 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.418022 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.433084 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.451458 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.468825 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.472529 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.472563 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.472571 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.472586 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.472602 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.485139 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.504816 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.521101 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.535225 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc 
kubenswrapper[4806]: I0217 15:21:31.556322 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.574976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.575021 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.575032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.575054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.575070 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.577655 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.600952 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.614259 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.628262 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.646826 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.665029 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.677598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.677635 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.677652 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc 
kubenswrapper[4806]: I0217 15:21:31.677673 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.677688 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.697850 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:31Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.781065 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.781153 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.781178 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.781206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.781227 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.884854 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.884901 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.884915 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.884935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.884949 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.988371 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.988485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.988510 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.988538 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:31 crc kubenswrapper[4806]: I0217 15:21:31.988556 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:31Z","lastTransitionTime":"2026-02-17T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.091691 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.091757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.091774 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.091799 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.091818 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.141679 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 01:00:02.026611111 +0000 UTC Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.160024 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:32 crc kubenswrapper[4806]: E0217 15:21:32.160290 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.195095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.195210 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.195234 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.195269 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.195295 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.298538 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.298601 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.298628 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.298658 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.298680 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.401730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.401792 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.401813 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.401842 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.401866 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.504828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.504904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.504930 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.504960 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.504984 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.608065 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.608129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.608145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.608171 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.608191 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.711966 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.712050 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.712075 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.712109 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.712130 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.815281 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.815354 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.815372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.815401 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.815450 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.919232 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.919305 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.919323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.919349 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:32 crc kubenswrapper[4806]: I0217 15:21:32.919367 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:32Z","lastTransitionTime":"2026-02-17T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.022465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.022529 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.022547 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.022571 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.022588 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.065978 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.089932 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.110108 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.125538 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.125607 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.125634 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.125667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.125693 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.129937 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.142829 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:22:53.097364215 +0000 UTC Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.147773 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.160762 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.160798 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:33 crc kubenswrapper[4806]: E0217 15:21:33.160872 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.160761 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:33 crc kubenswrapper[4806]: E0217 15:21:33.161085 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:33 crc kubenswrapper[4806]: E0217 15:21:33.161255 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.168360 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc 
kubenswrapper[4806]: I0217 15:21:33.186898 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.221452 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.229307 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.229394 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.229450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.229489 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.229509 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.242910 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.264341 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.281916 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.298844 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.320175 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.336600 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.336659 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc 
kubenswrapper[4806]: I0217 15:21:33.336674 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.336692 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.336705 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.342820 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.373476 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.389116 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.403800 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.426488 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.439651 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.439704 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.439725 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.439750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.439770 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.446071 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:33Z 
is after 2025-08-24T17:21:41Z" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.542399 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.542457 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.542468 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.542483 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.542493 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.645036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.645111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.645137 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.645166 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.645188 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.748356 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.748455 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.748475 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.748501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.748524 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.851557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.851627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.851645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.851673 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.851694 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.955373 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.955452 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.955470 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.955493 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:33 crc kubenswrapper[4806]: I0217 15:21:33.955513 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:33Z","lastTransitionTime":"2026-02-17T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.058875 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.058963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.058988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.059024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.059081 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.143241 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:01:15.663979247 +0000 UTC Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.160677 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:34 crc kubenswrapper[4806]: E0217 15:21:34.160896 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.162958 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.163019 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.163029 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.163050 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.163062 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.266810 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.266862 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.266876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.266895 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.266908 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.369795 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.369861 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.369881 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.369905 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.369926 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.472350 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.472488 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.472515 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.472546 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.472573 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.575967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.576051 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.576073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.576097 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.576118 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.679373 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.679457 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.679475 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.679498 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.679517 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.786769 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.786841 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.786862 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.786888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.786908 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.890849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.890918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.890939 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.890965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.890987 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.994685 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.994756 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.994775 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.994801 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:34 crc kubenswrapper[4806]: I0217 15:21:34.994819 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:34Z","lastTransitionTime":"2026-02-17T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.097736 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.097801 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.097823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.097849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.097867 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.143845 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:24:03.514590331 +0000 UTC Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.160390 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.160492 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.160501 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.160627 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.160791 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.160829 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.200239 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.200284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.200299 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.200316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.200325 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.302987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.303036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.303045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.303060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.303072 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.405556 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.405627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.405647 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.405671 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.405687 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.508396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.508511 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.508531 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.508559 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.508577 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.610116 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.610158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.610169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.610186 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.610199 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.650494 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.650572 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.650590 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.650617 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.650635 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.671164 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:35Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.676975 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.677040 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.677060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.677514 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.677574 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.698662 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:35Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.704491 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.704564 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.704587 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.704619 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.704641 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.725370 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:35Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.731769 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.731962 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.732153 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.732362 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.732606 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.753752 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:35Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.759196 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.759262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.759285 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.759316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.759337 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.779133 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:35Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:35 crc kubenswrapper[4806]: E0217 15:21:35.779356 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.785359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.785476 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.785506 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.785542 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.785578 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.890456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.890541 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.890561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.890587 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.890604 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.993359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.993466 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.993489 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.993523 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:35 crc kubenswrapper[4806]: I0217 15:21:35.993548 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:35Z","lastTransitionTime":"2026-02-17T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.096733 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.096816 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.096851 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.096882 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.096904 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.144214 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 06:45:48.692480204 +0000 UTC Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.160688 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:36 crc kubenswrapper[4806]: E0217 15:21:36.160902 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.199718 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.199804 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.199828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.199860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.199882 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.302248 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.302296 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.302313 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.302335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.302353 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.405135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.405205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.405223 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.405249 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.405266 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.508354 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.508388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.508398 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.508427 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.508435 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.611036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.611096 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.611118 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.611143 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.611162 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.714164 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.714227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.714245 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.714272 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.714289 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.817322 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.817378 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.817396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.817449 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.817467 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.921182 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.921252 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.921274 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.921305 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:36 crc kubenswrapper[4806]: I0217 15:21:36.921328 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:36Z","lastTransitionTime":"2026-02-17T15:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.024691 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.024765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.024804 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.024836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.024857 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.128665 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.128734 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.128752 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.128777 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.128796 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.144497 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 03:18:47.478862262 +0000 UTC Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.160979 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.161017 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.161130 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:37 crc kubenswrapper[4806]: E0217 15:21:37.161168 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:37 crc kubenswrapper[4806]: E0217 15:21:37.161365 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:37 crc kubenswrapper[4806]: E0217 15:21:37.161560 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.231756 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.231832 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.231849 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.231875 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.231894 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.334889 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.334985 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.335019 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.335053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.335076 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.438564 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.438843 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.438868 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.438899 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.438926 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.541765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.541827 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.541844 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.541877 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.541898 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.646174 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.646242 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.646261 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.646294 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.646314 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.749881 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.750065 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.750082 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.750101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.750114 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.858235 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.858292 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.858306 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.858325 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.858339 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.961623 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.961695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.961713 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.961742 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:37 crc kubenswrapper[4806]: I0217 15:21:37.961760 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:37Z","lastTransitionTime":"2026-02-17T15:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.065611 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.065673 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.065691 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.065719 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.065737 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.144960 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 10:38:21.759446915 +0000 UTC Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.160390 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:38 crc kubenswrapper[4806]: E0217 15:21:38.160569 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.169122 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.169176 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.169195 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.169219 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.169237 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.272073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.272133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.272153 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.272178 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.272198 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.375118 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.375161 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.375171 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.375187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.375198 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.477907 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.477976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.477994 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.478021 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.478039 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.581958 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.582048 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.582072 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.582105 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.582129 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.686069 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.686158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.686184 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.686220 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.686242 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.790149 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.790235 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.790253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.790279 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.790297 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.893168 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.893262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.893290 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.893323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.893346 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.996629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.996693 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.996714 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.996742 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:38 crc kubenswrapper[4806]: I0217 15:21:38.996762 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:38Z","lastTransitionTime":"2026-02-17T15:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.099593 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.099674 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.099695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.099723 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.099743 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.146978 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 02:51:09.683869697 +0000 UTC Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.160786 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:39 crc kubenswrapper[4806]: E0217 15:21:39.160961 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.161310 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:39 crc kubenswrapper[4806]: E0217 15:21:39.161568 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.161615 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:39 crc kubenswrapper[4806]: E0217 15:21:39.161762 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.181269 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.204519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.204604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.204629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.204661 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.204687 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.219901 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.239343 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.265635 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.285444 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.303881 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.308688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.309052 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.309356 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.309669 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.309919 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.324375 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.339800 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.370389 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.390341 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.409020 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.413528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.413695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.413801 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.413890 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.413980 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.431150 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.449762 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.472798 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.489541 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.501307 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.516014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.516047 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.516058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.516073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.516084 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.519613 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.532699 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:39Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:39 crc 
kubenswrapper[4806]: I0217 15:21:39.619014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.619053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.619064 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.619083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.619098 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.722352 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.722448 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.722472 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.722502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.722526 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.825504 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.825557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.825570 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.825588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.825601 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.928884 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.928928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.928937 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.928953 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:39 crc kubenswrapper[4806]: I0217 15:21:39.928962 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:39Z","lastTransitionTime":"2026-02-17T15:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.032315 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.032355 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.032366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.032380 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.032390 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.135632 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.135765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.135785 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.135815 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.135842 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.147163 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 08:06:41.357406279 +0000 UTC Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.159961 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:40 crc kubenswrapper[4806]: E0217 15:21:40.160174 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.239001 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.239083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.239102 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.239127 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.239148 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.342264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.342329 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.342347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.342378 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.342396 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.445009 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.445068 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.445088 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.445111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.445128 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.547765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.547830 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.547847 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.547873 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.547892 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.650743 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.650816 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.650836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.650861 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.650880 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.753828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.753911 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.753936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.753967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.753992 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.857313 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.857806 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.857976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.858127 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.858268 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.962112 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.962185 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.962206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.962238 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:40 crc kubenswrapper[4806]: I0217 15:21:40.962260 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:40Z","lastTransitionTime":"2026-02-17T15:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.065540 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.065591 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.065609 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.065645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.065661 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.147990 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:16:40.027276513 +0000 UTC Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.164494 4806 scope.go:117] "RemoveContainer" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" Feb 17 15:21:41 crc kubenswrapper[4806]: E0217 15:21:41.165862 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.164836 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:41 crc kubenswrapper[4806]: E0217 15:21:41.166293 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.164787 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:41 crc kubenswrapper[4806]: E0217 15:21:41.166785 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.164858 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:41 crc kubenswrapper[4806]: E0217 15:21:41.168221 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.172229 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.172284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.172315 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.172352 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.172372 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.275367 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.275458 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.275478 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.275503 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.275520 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.378090 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.378121 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.378128 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.378142 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.378151 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.481450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.481500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.481517 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.481539 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.481557 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.584108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.584178 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.584195 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.584227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.584244 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.688327 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.688393 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.688437 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.688466 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.688486 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.791304 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.791357 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.791370 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.791391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.791455 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.894518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.894587 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.894605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.894629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.894651 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.997814 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.997878 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.997900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.997924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:41 crc kubenswrapper[4806]: I0217 15:21:41.997944 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:41Z","lastTransitionTime":"2026-02-17T15:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.100580 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.100649 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.100667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.100694 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.100715 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.149185 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:56:14.135751956 +0000 UTC Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.160750 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:42 crc kubenswrapper[4806]: E0217 15:21:42.160946 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.203500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.203575 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.203593 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.203621 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.203640 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.306104 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.306154 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.306173 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.306198 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.306220 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.408745 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.408816 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.408836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.408860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.408879 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.512338 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.512396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.512424 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.512443 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.512461 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.615638 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.615674 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.615688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.615710 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.615725 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.719010 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.719076 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.719094 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.719120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.719139 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.822749 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.822828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.822853 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.822887 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.822910 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.926656 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.926730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.926757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.926790 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:42 crc kubenswrapper[4806]: I0217 15:21:42.926813 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:42Z","lastTransitionTime":"2026-02-17T15:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.030229 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.030284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.030303 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.030331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.030349 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.134831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.134911 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.134935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.134966 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.134985 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.150299 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:20:21.288692569 +0000 UTC Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.160730 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.160759 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.160850 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:43 crc kubenswrapper[4806]: E0217 15:21:43.160930 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:43 crc kubenswrapper[4806]: E0217 15:21:43.161075 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:43 crc kubenswrapper[4806]: E0217 15:21:43.161288 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.237854 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.237889 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.237900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.237918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.237931 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.340610 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.340664 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.340674 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.340692 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.340701 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.446812 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.446908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.446932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.446963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.446990 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.549812 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.549877 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.549895 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.549921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.549938 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.652502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.652574 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.652598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.652626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.652643 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.755779 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.755861 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.755900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.755935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.755959 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.859394 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.859481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.859500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.859524 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.859540 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.962477 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.962560 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.962581 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.962605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:43 crc kubenswrapper[4806]: I0217 15:21:43.962622 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:43Z","lastTransitionTime":"2026-02-17T15:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.066598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.066653 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.066670 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.066700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.066717 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.150924 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 17:45:43.529527237 +0000 UTC Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.160299 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:44 crc kubenswrapper[4806]: E0217 15:21:44.160509 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.169018 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.169048 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.169056 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.169067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.169078 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.271263 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.271292 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.271300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.271330 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.271341 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.373956 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.374007 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.374020 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.374039 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.374051 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.477582 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.477646 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.477664 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.477691 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.477708 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.580518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.580575 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.580585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.580604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.580976 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.684312 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.684368 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.684392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.684439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.684454 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.787658 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.787715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.787726 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.787748 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.787761 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.890828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.890895 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.890905 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.890928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.890940 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.993773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.993831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.993843 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.993864 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:44 crc kubenswrapper[4806]: I0217 15:21:44.993879 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:44Z","lastTransitionTime":"2026-02-17T15:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.097060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.097107 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.097117 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.097133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.097143 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.152119 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 23:54:42.199931932 +0000 UTC Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.160506 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.160512 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.160541 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.160728 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.160837 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.160956 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.200154 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.200228 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.200246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.200276 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.200298 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.303683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.303730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.303745 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.303768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.303785 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.406463 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.406502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.406513 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.406531 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.406544 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.509647 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.509759 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.509776 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.509804 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.509821 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.612685 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.612745 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.612756 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.612772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.612781 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.715821 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.715894 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.715910 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.715935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.715953 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.819122 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.819185 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.819202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.819227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.819245 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.871782 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.871819 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.871828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.871842 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.871850 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.888669 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:45Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.892710 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.892838 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.892860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.892888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.892908 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.910947 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:45Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.915532 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.915687 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.915768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.915857 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.915950 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.929138 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:45Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.933578 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.933657 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.933683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.933715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.933734 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.954445 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.954503 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.954529 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.954557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.954580 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:45 crc kubenswrapper[4806]: E0217 15:21:45.970646 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.972501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.972647 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.972742 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.972829 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:45 crc kubenswrapper[4806]: I0217 15:21:45.972923 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:45Z","lastTransitionTime":"2026-02-17T15:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.075696 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.075785 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.075802 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.075827 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.075846 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.153032 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 14:37:41.597395128 +0000 UTC Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.160521 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:46 crc kubenswrapper[4806]: E0217 15:21:46.160686 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.178161 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.178222 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.178242 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.178268 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.178284 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.280629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.280667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.280677 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.280691 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.280700 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.383552 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.383588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.383598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.383612 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.383619 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.486308 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.486355 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.486374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.486437 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.486457 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.589759 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.589815 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.589831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.589855 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.589871 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.695307 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.695356 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.695369 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.695399 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.695428 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.737252 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:46 crc kubenswrapper[4806]: E0217 15:21:46.737518 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:46 crc kubenswrapper[4806]: E0217 15:21:46.737645 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:22:18.737614519 +0000 UTC m=+100.268244960 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.798388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.798454 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.798465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.798504 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.798520 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.901664 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.901747 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.901772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.901803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:46 crc kubenswrapper[4806]: I0217 15:21:46.901825 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:46Z","lastTransitionTime":"2026-02-17T15:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.004838 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.004880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.004893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.004911 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.004923 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.108101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.108139 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.108156 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.108179 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.108196 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.154171 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:50:34.671357052 +0000 UTC Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.160592 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.160628 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.160616 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:47 crc kubenswrapper[4806]: E0217 15:21:47.160834 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:47 crc kubenswrapper[4806]: E0217 15:21:47.160949 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:47 crc kubenswrapper[4806]: E0217 15:21:47.161104 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.210088 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.210120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.210129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.210145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.210154 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.312761 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.312804 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.312823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.312847 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.312865 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.415183 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.415243 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.415260 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.415284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.415300 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.518291 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.518346 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.518359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.518380 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.518396 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.620636 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.620685 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.620699 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.620719 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.620731 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.723287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.723322 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.723330 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.723344 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.723352 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.826211 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.826264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.826284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.826314 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.826331 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.929364 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.929446 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.929462 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.929485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:47 crc kubenswrapper[4806]: I0217 15:21:47.929502 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:47Z","lastTransitionTime":"2026-02-17T15:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.032387 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.032462 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.032479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.032499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.032512 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.134657 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.134694 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.134704 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.134718 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.134727 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.155190 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 15:20:49.336599913 +0000 UTC Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.160531 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:48 crc kubenswrapper[4806]: E0217 15:21:48.160726 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.236501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.236535 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.236544 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.236557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.236566 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.339067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.339111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.339120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.339137 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.339146 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.441607 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.441663 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.441686 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.441720 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.441742 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.544971 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.545074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.545097 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.545126 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.545146 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.649095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.649135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.649154 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.649172 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.649183 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.751330 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.751395 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.751451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.751478 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.751495 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.854314 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.854400 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.854430 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.854451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.854463 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.957823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.957900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.957923 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.957956 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:48 crc kubenswrapper[4806]: I0217 15:21:48.957979 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:48Z","lastTransitionTime":"2026-02-17T15:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.060776 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.060840 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.060858 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.060886 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.060902 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.155859 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 10:48:41.338431371 +0000 UTC Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.160283 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.160333 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.160355 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:49 crc kubenswrapper[4806]: E0217 15:21:49.160484 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:49 crc kubenswrapper[4806]: E0217 15:21:49.160625 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:49 crc kubenswrapper[4806]: E0217 15:21:49.160768 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.162952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.163003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.163020 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.163037 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.163047 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.177494 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z 
is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.196530 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f20
2b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.215245 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.235476 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.249193 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc 
kubenswrapper[4806]: I0217 15:21:49.265650 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.265699 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.265712 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.265732 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.265745 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.266714 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.284018 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.303037 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.319302 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.334249 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.346358 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.364582 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.368971 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.369047 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.369070 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.369101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.369124 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.397908 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.412108 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.430841 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.446517 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.461554 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.472317 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.472365 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.472377 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc 
kubenswrapper[4806]: I0217 15:21:49.472395 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.472433 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.486676 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:49Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.574622 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.574677 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.574690 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.574710 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.574724 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.676852 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.676920 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.676948 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.676984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.677008 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.780636 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.780695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.780714 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.780739 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.780756 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.883797 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.884171 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.884180 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.884194 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.884203 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.986942 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.986980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.986993 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.987009 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:49 crc kubenswrapper[4806]: I0217 15:21:49.987030 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:49Z","lastTransitionTime":"2026-02-17T15:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.089505 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.089536 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.089545 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.089559 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.089569 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.156626 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:24:41.606898483 +0000 UTC Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.160869 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:50 crc kubenswrapper[4806]: E0217 15:21:50.161023 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.191689 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.191714 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.191722 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.191732 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.191741 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.294980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.295043 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.295060 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.295087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.295106 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.398022 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.398067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.398084 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.398108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.398125 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.500760 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.500823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.500846 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.500872 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.500892 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.602921 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.602988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.603006 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.603030 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.603048 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.630972 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/0.log" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.631032 4806 generic.go:334] "Generic (PLEG): container finished" podID="344f8a87-e00f-4f0a-a0bc-aee197271160" containerID="1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc" exitCode=1 Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.631065 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerDied","Data":"1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.631468 4806 scope.go:117] "RemoveContainer" containerID="1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.650237 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.669723 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.693450 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.705938 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.706016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.706038 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.706063 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.706085 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.709682 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.725627 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.749701 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.770152 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.790468 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.808828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.809078 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.809206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: 
I0217 15:21:50.809316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.808838 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.809421 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.825779 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.838749 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.850534 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc 
kubenswrapper[4806]: I0217 15:21:50.860736 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.876558 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.895319 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.910720 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.912306 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.912333 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.912342 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.912355 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.912377 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:50Z","lastTransitionTime":"2026-02-17T15:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.927502 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:50 crc kubenswrapper[4806]: I0217 15:21:50.939766 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:50Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.014897 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.014942 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.014952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.014968 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.014980 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.117646 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.117688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.117697 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.117715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.117726 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.157694 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 12:30:44.006610064 +0000 UTC Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.159991 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.159995 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:51 crc kubenswrapper[4806]: E0217 15:21:51.160113 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:51 crc kubenswrapper[4806]: E0217 15:21:51.160203 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.160012 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:51 crc kubenswrapper[4806]: E0217 15:21:51.160774 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.220852 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.220952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.221010 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.221041 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.221059 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.324794 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.324837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.324848 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.324864 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.324875 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.427309 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.427390 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.427450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.427482 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.427503 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.530373 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.531442 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.531533 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.531626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.531711 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.634064 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.634123 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.634133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.634148 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.634160 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.637933 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/0.log" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.638068 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerStarted","Data":"4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.650100 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc 
kubenswrapper[4806]: I0217 15:21:51.664764 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9
b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.679703 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.697938 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.719771 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.733355 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.736550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.736610 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.736630 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.736658 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.736677 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.748127 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.765725 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:
20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.792195 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.807288 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.821788 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef33025caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.836671 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.839892 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.839974 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc 
kubenswrapper[4806]: I0217 15:21:51.840001 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.840034 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.840058 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.850107 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.872515 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.886588 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.900083 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.912841 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.932162 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:51Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.943012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.943073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.943094 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.943120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:51 crc kubenswrapper[4806]: I0217 15:21:51.943137 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:51Z","lastTransitionTime":"2026-02-17T15:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.050080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.050144 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.050162 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.050186 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.050203 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.153392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.153445 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.153455 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.153471 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.153481 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.158733 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 14:00:06.003091493 +0000 UTC Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.159984 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:52 crc kubenswrapper[4806]: E0217 15:21:52.160356 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.160573 4806 scope.go:117] "RemoveContainer" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.255247 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.255288 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.255298 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.255315 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.255327 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.358803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.358851 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.358860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.358880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.358892 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.461729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.461789 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.461806 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.461833 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.461852 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.565169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.565217 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.565232 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.565253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.565270 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.643544 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/2.log" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.646175 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.646796 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.661626 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.667700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.667760 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.667776 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.667797 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.667812 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.679018 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.695138 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.709038 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.719595 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.731709 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.744699 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc 
kubenswrapper[4806]: I0217 15:21:52.759652 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9
b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.769768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.769804 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.769819 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.769836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.769848 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.773675 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.786562 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.800225 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.811057 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.823253 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.839383 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.869422 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.871755 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.871809 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: 
I0217 15:21:52.871821 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.871836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.871845 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.900175 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event 
handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.917166 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.935852 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.974563 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.974668 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.974695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:52 crc 
kubenswrapper[4806]: I0217 15:21:52.974729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:52 crc kubenswrapper[4806]: I0217 15:21:52.974752 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:52Z","lastTransitionTime":"2026-02-17T15:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.077708 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.077769 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.077786 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.077811 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.077829 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.159301 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:05:28.987464561 +0000 UTC Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.160719 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.160777 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:53 crc kubenswrapper[4806]: E0217 15:21:53.160954 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.161018 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:53 crc kubenswrapper[4806]: E0217 15:21:53.161201 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:53 crc kubenswrapper[4806]: E0217 15:21:53.161390 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.180479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.180688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.180906 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.181076 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.181260 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.290966 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.291025 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.291044 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.291071 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.291091 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.393979 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.394024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.394036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.394054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.394064 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.498580 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.498680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.498700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.498729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.498748 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.602604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.602673 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.602688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.602713 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.602728 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.653113 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/3.log" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.654157 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/2.log" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.658026 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" exitCode=1 Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.658096 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.658146 4806 scope.go:117] "RemoveContainer" containerID="012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.659461 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:21:53 crc kubenswrapper[4806]: E0217 15:21:53.659793 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.684227 4806 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to 
/host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.706307 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.709098 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.709154 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.709174 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.709203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.709223 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.726675 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.743750 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec36
6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}}
,\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.757234 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.773929 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.790688 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.806829 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.812932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.813011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.813039 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.813073 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.813101 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.824457 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.839871 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.855553 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.906074 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.916506 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.916584 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.916607 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.917129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.917184 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:53Z","lastTransitionTime":"2026-02-17T15:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.948721 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.967866 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:53 crc kubenswrapper[4806]: I0217 15:21:53.983733 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.002081 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:53Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.015826 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.020553 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.020585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.020594 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc 
kubenswrapper[4806]: I0217 15:21:54.020611 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.020623 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.033433 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://012714cbfe0cbb4fc6b77757c4d20de70058a7af68a278212cb985c981717e0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:26Z\\\",\\\"message\\\":\\\"dler 7 for removal\\\\nI0217 15:21:26.108862 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 15:21:26.108902 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 15:21:26.108998 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 15:21:26.110286 6457 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0217 15:21:26.110342 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0217 15:21:26.110293 6457 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110448 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 15:21:26.110317 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 15:21:26.110513 6457 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0217 15:21:26.110612 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 15:21:26.110622 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 15:21:26.110781 6457 factory.go:656] Stopping watch factory\\\\nI0217 15:21:26.110863 6457 ovnkube.go:599] Stopped ovnkube\\\\nI0217 15:21:26.110780 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 15:21:26.110998 6457 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 15:21:26.111240 6457 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:52Z\\\",\\\"message\\\":\\\"hNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.153],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0217 15:21:52.983143 6861 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/packageserver-service for network=default are: map[]\\\\nF0217 15:21:52.983162 
6861 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z]\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.123703 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.123739 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.123749 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.123765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.123776 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.159625 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 15:39:29.961750259 +0000 UTC Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.160939 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:54 crc kubenswrapper[4806]: E0217 15:21:54.161077 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.226589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.226627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.226635 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.226651 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.226661 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.329393 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.329482 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.329502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.329528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.329548 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.432338 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.432397 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.432441 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.432465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.432484 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.535500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.535568 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.535585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.535612 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.535635 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.639558 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.639614 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.639633 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.639663 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.639681 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.667307 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/3.log" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.672969 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:21:54 crc kubenswrapper[4806]: E0217 15:21:54.673228 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.694497 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.733100 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.743239 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.743335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: 
I0217 15:21:54.743360 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.743393 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.743455 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.755036 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.776476 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.794878 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.812879 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.833538 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.846414 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.846677 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc 
kubenswrapper[4806]: I0217 15:21:54.846845 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.846976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.847079 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.852788 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.881013 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:52Z\\\",\\\"message\\\":\\\"hNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.153],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},
}\\\\nI0217 15:21:52.983143 6861 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/packageserver-service for network=default are: map[]\\\\nF0217 15:21:52.983162 6861 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z]\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.904757 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.929874 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.950870 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.951203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.951322 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.951474 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.951590 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:54Z","lastTransitionTime":"2026-02-17T15:21:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.960843 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.980018 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:54 crc kubenswrapper[4806]: I0217 15:21:54.999083 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092f
b16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:54Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.017384 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:55Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.032064 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:55Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.048368 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:55Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.054360 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.054425 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.054442 4806 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.054462 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.054479 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.071269 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:55Z is after 2025-08-24T17:21:41Z" Feb 
17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.158323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.158380 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.158428 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.158456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.158474 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.160663 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 06:54:43.451407977 +0000 UTC Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.160795 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.160844 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:55 crc kubenswrapper[4806]: E0217 15:21:55.160937 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.161040 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:55 crc kubenswrapper[4806]: E0217 15:21:55.161085 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:55 crc kubenswrapper[4806]: E0217 15:21:55.161319 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.263102 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.263176 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.263205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.263238 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.263264 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.366669 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.366739 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.366757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.366783 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.366804 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.469982 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.470032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.470045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.470069 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.470083 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.572905 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.572949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.572961 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.572980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.572993 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.675928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.675987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.676004 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.676027 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.676047 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.779588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.779673 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.779698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.779730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.779748 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.882930 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.883008 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.883034 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.883066 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.883093 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.986654 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.986733 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.986758 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.986788 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:55 crc kubenswrapper[4806]: I0217 15:21:55.986812 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:55Z","lastTransitionTime":"2026-02-17T15:21:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.090842 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.090891 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.090918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.090943 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.090963 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.129879 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.130003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.130038 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.130066 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.130089 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.153826 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:56Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.159026 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.159080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.159099 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.159126 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.159143 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.160489 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.160673 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.160952 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 15:24:53.51030979 +0000 UTC Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.179543 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:56Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.184652 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.184714 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.184730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.184754 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.184772 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.210589 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:56Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.216335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.216370 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.216385 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.216428 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.216446 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.235362 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:56Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.239969 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.240004 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.240017 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.240036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.240051 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.258666 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:56Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:56 crc kubenswrapper[4806]: E0217 15:21:56.258880 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.261012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.261067 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.261078 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.261102 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.261115 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.363767 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.363796 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.363805 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.363820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.363829 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.467771 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.467840 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.467863 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.467891 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.467912 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.570865 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.570936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.570957 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.570981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.570998 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.674438 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.674499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.674519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.674544 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.674563 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.777925 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.777992 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.778012 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.778038 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.778057 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.880661 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.880720 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.880736 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.880761 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.880782 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.984081 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.984166 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.984190 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.984226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:56 crc kubenswrapper[4806]: I0217 15:21:56.984251 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:56Z","lastTransitionTime":"2026-02-17T15:21:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.087523 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.087596 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.087616 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.087643 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.087662 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.160861 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.160980 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.161058 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:11:58.593722551 +0000 UTC Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.160857 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:57 crc kubenswrapper[4806]: E0217 15:21:57.161165 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:57 crc kubenswrapper[4806]: E0217 15:21:57.161231 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:57 crc kubenswrapper[4806]: E0217 15:21:57.161358 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.191316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.191360 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.191374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.191391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.191426 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.294493 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.294566 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.294591 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.294623 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.294642 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.398428 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.398491 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.398512 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.398539 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.398557 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.502190 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.502270 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.502292 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.502322 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.502343 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.605904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.605971 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.605988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.606016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.606066 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.709134 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.709203 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.709219 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.709245 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.709263 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.812145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.812204 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.812221 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.812249 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.812267 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.915518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.915587 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.915604 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.915629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:57 crc kubenswrapper[4806]: I0217 15:21:57.915648 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:57Z","lastTransitionTime":"2026-02-17T15:21:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.019120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.019219 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.019241 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.019267 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.019285 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.122731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.122788 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.122807 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.122833 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.122852 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.160869 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:21:58 crc kubenswrapper[4806]: E0217 15:21:58.160997 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.161958 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:35:13.259565857 +0000 UTC Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.225262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.225309 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.225321 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.225337 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.225348 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.328459 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.328519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.328534 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.328562 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.328580 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.431450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.431800 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.431828 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.431862 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.431884 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.534934 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.535005 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.535023 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.535050 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.535068 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.643078 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.643766 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.643814 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.643842 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.643861 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.747771 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.747875 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.747922 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.747945 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.747964 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.850965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.851037 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.851055 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.851080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.851098 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.954333 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.954451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.954478 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.954515 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:58 crc kubenswrapper[4806]: I0217 15:21:58.954538 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:58Z","lastTransitionTime":"2026-02-17T15:21:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.057150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.057202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.057216 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.057237 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.057250 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.160279 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.160278 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:21:59 crc kubenswrapper[4806]: E0217 15:21:59.160589 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.160706 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:21:59 crc kubenswrapper[4806]: E0217 15:21:59.160852 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.160940 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.160991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.161016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.161045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: E0217 15:21:59.160947 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.161067 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.162813 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 03:42:46.642771189 +0000 UTC Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.194131 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:52Z\\\",\\\"message\\\":\\\"hNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:
[10.217.4.153],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0217 15:21:52.983143 6861 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/packageserver-service for network=default are: map[]\\\\nF0217 15:21:52.983162 6861 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z]\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.215136 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.233301 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.257539 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.263559 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.263631 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.263657 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.263693 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.263716 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.285288 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.308592 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.328932 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.343268 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.357955 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.365711 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.365775 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.365786 4806 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.365826 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.365838 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.371153 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 
17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.391789 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastS
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.411280 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.428245 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.449869 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.462105 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.468872 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.469150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.469169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.469191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.469209 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.477769 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.489235 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:
20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.521278 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:59Z is after 2025-08-24T17:21:41Z" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.572459 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.572529 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.572553 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.572581 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.572604 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.674883 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.674961 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.674985 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.675013 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.675034 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.778735 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.778782 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.778794 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.778811 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.778822 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.881206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.881252 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.881270 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.881293 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.881312 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.984751 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.984833 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.984852 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.984877 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:21:59 crc kubenswrapper[4806]: I0217 15:21:59.984900 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:21:59Z","lastTransitionTime":"2026-02-17T15:21:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.087812 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.087880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.087900 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.087925 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.087942 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.160325 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:00 crc kubenswrapper[4806]: E0217 15:22:00.160570 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.163626 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 04:10:09.23843304 +0000 UTC Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.192173 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.192260 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.192284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.192323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.192346 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.296345 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.296473 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.296495 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.296527 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.296551 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.399787 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.399854 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.399874 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.399897 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.399915 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.502850 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.502918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.502932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.502960 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.502974 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.606005 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.606074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.606091 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.606119 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.606138 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.709001 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.709051 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.709063 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.709082 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.709092 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.812087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.812129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.812141 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.812158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.812169 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.915586 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.915645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.915665 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.915692 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:00 crc kubenswrapper[4806]: I0217 15:22:00.915710 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:00Z","lastTransitionTime":"2026-02-17T15:22:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.018269 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.018323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.018335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.018382 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.018398 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.121537 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.121602 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.121626 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.121653 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.121670 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.160775 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.160802 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:01 crc kubenswrapper[4806]: E0217 15:22:01.160915 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.161051 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:01 crc kubenswrapper[4806]: E0217 15:22:01.161178 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:01 crc kubenswrapper[4806]: E0217 15:22:01.161284 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.163738 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 10:49:31.317789883 +0000 UTC Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.224846 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.224897 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.224910 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.224932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.224946 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.328280 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.328367 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.328382 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.328557 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.328574 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.432954 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.433056 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.433081 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.433113 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.433136 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.537081 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.537140 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.537157 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.537184 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.537202 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.640750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.640820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.640843 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.640876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.640896 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.743066 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.743121 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.743135 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.743159 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.743172 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.846020 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.846068 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.846084 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.846105 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.846120 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.949259 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.949314 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.949331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.949385 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:01 crc kubenswrapper[4806]: I0217 15:22:01.949440 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:01Z","lastTransitionTime":"2026-02-17T15:22:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.053200 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.053257 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.053271 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.053292 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.053310 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.156796 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.156847 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.156858 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.156876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.156888 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.160531 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:02 crc kubenswrapper[4806]: E0217 15:22:02.160702 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.164719 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:46:12.034999318 +0000 UTC Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.260263 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.260346 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.260371 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.260446 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.260478 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.363714 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.363757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.363768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.363784 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.363798 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.466519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.466573 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.466608 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.466627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.466680 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.569884 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.569955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.569966 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.569981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.569996 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.673579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.673634 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.673646 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.673666 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.673680 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.776987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.777058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.777076 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.777101 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.777120 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.880718 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.880773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.880787 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.880808 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.880820 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.992924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.993013 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.993038 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.993070 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:02 crc kubenswrapper[4806]: I0217 15:22:02.993093 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:02Z","lastTransitionTime":"2026-02-17T15:22:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.016377 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.016626 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 15:23:07.016597444 +0000 UTC m=+148.547227885 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.097112 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.097186 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.097204 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.097239 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.097268 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.118144 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.118217 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.118258 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.118336 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118481 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118504 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118529 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118553 4806 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118534 4806 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118573 4806 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118496 4806 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118556 4806 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:22:03 crc 
kubenswrapper[4806]: E0217 15:22:03.118637 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.118614073 +0000 UTC m=+148.649244484 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118750 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.118737736 +0000 UTC m=+148.649368147 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118770 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.118761397 +0000 UTC m=+148.649391808 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.118787 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.118779788 +0000 UTC m=+148.649410199 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.160764 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.160836 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.160892 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.160925 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.161087 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:03 crc kubenswrapper[4806]: E0217 15:22:03.161303 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.165525 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:05:02.100021341 +0000 UTC Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.201505 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.201579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.201598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.201630 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.201649 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.304790 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.304864 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.304885 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.304917 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.304937 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.408371 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.408459 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.408479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.408506 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.408524 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.511802 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.511870 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.511888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.511912 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.511930 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.614851 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.614924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.614952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.615014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.615036 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.718252 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.718744 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.718962 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.719226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.719513 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.822925 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.823312 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.823397 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.823499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.823604 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.926502 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.926577 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.926595 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.926627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:03 crc kubenswrapper[4806]: I0217 15:22:03.926649 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:03Z","lastTransitionTime":"2026-02-17T15:22:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.029661 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.030206 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.030233 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.030263 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.030292 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.134014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.134078 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.134096 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.134124 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.134143 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.160558 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:04 crc kubenswrapper[4806]: E0217 15:22:04.160749 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.166657 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 12:44:34.984280607 +0000 UTC Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.246198 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.246279 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.246303 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.246337 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.246363 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.350162 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.350451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.350597 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.350737 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.350866 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.454053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.454248 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.454439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.454599 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.454775 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.557732 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.557793 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.557812 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.557839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.557859 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.661598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.662030 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.662244 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.662481 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.662680 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.765177 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.765246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.765269 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.765302 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.765326 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.868246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.868296 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.868349 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.868374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.868391 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.971326 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.971516 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.971551 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.971841 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:04 crc kubenswrapper[4806]: I0217 15:22:04.971875 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:04Z","lastTransitionTime":"2026-02-17T15:22:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.074847 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.074920 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.074938 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.074963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.074980 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.160624 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.160694 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.160813 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:05 crc kubenswrapper[4806]: E0217 15:22:05.160991 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:05 crc kubenswrapper[4806]: E0217 15:22:05.161181 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:05 crc kubenswrapper[4806]: E0217 15:22:05.161294 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.167236 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 23:13:26.012373574 +0000 UTC Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.177632 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.177669 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.177688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.177712 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.177729 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.280226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.280273 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.280284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.280311 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.280326 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.382979 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.383019 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.383031 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.383049 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.383062 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.486698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.486749 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.486770 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.486796 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.486814 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.589056 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.589144 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.589161 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.589186 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.589204 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.691339 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.691397 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.691442 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.691469 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.691489 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.794654 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.794716 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.794744 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.794773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.794796 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.898699 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.898768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.898787 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.898813 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:05 crc kubenswrapper[4806]: I0217 15:22:05.898831 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:05Z","lastTransitionTime":"2026-02-17T15:22:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.001479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.002011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.002219 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.002375 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.002549 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.106024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.106098 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.106123 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.106161 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.106188 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.160214 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.160473 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.162581 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.163027 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.167376 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 06:16:52.727031614 +0000 UTC Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.208877 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.208935 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.208957 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.208984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.209007 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.286028 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.286108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.286130 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.286159 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.286186 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.308721 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.315090 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.315155 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.315173 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.315198 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.315216 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.335322 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.340768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.340823 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.340840 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.340865 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.340885 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.362921 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.369097 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.369237 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.369268 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.369344 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.369377 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.393537 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.398908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.398963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.398980 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.399003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.399021 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.418480 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:06Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:06 crc kubenswrapper[4806]: E0217 15:22:06.418708 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.420981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.421031 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.421049 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.421074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.421091 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.524367 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.524465 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.524484 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.524544 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.524567 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.628192 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.628266 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.628283 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.628308 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.628325 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.731344 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.731443 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.731469 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.731500 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.731521 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.834550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.834622 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.834641 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.834667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.834686 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.937504 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.937575 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.937598 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.937630 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:06 crc kubenswrapper[4806]: I0217 15:22:06.937652 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:06Z","lastTransitionTime":"2026-02-17T15:22:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.040936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.041011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.041035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.041066 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.041119 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.144137 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.144202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.144223 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.144249 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.144267 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.161048 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.161093 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:07 crc kubenswrapper[4806]: E0217 15:22:07.161158 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.161171 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:07 crc kubenswrapper[4806]: E0217 15:22:07.161297 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:07 crc kubenswrapper[4806]: E0217 15:22:07.161574 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.168105 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 04:29:07.085608204 +0000 UTC Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.247680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.247761 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.247782 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.247807 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.247825 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.350993 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.351456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.351616 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.351757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.351906 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.455469 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.455535 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.455553 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.455580 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.455598 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.558226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.558284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.558304 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.558328 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.558346 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.661337 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.661464 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.661491 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.661522 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.661546 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.764968 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.765033 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.765051 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.765075 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.765091 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.868349 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.868451 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.868478 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.868510 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.868567 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.971284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.971347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.971368 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.971400 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:07 crc kubenswrapper[4806]: I0217 15:22:07.971483 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:07Z","lastTransitionTime":"2026-02-17T15:22:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.074850 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.074904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.074914 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.074931 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.074943 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.160487 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:08 crc kubenswrapper[4806]: E0217 15:22:08.160673 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.168702 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 16:23:31.806976934 +0000 UTC Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.178629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.178663 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.178671 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.178684 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.178695 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.281080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.281150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.281160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.281175 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.281184 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.383476 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.384165 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.384341 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.384535 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.384671 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.487750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.487780 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.487789 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.487803 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.487816 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.590202 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.590291 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.590358 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.590387 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.590462 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.693978 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.694054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.694078 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.694108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.694130 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.797243 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.797327 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.797354 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.797388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.797443 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.900688 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.900746 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.900763 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.900787 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:08 crc kubenswrapper[4806]: I0217 15:22:08.900808 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:08Z","lastTransitionTime":"2026-02-17T15:22:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.004234 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.004293 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.004311 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.004335 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.004354 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.107680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.107772 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.107797 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.107830 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.107853 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.160714 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.160972 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.160971 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:09 crc kubenswrapper[4806]: E0217 15:22:09.161121 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:09 crc kubenswrapper[4806]: E0217 15:22:09.161213 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:09 crc kubenswrapper[4806]: E0217 15:22:09.161374 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.169184 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:50:06.381135005 +0000 UTC Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.182827 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59b65a3f-22cb-49c2-a3b6-1ff988aa300e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a3fbfb37b5414ec025d5f43d917070a6f184ed43b97afa18fb0ecf172be8941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e132d70a29452fa7e457f21b3e8a7b50f23e464d1b8d5ebc69dbac155fb3aeb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6f267c7f51124539d06dc56601c238878e17ec071a2777d33421a9460cfc2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3fe291b3dc0fece5f5165f81e51e61e6f9d815d25e1b53c3d61066ad6c311e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.210954 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.211016 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.211035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.211060 4806 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.211115 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.226647 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc05033-1332-4e64-8c8c-6db0924032fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7226f345bf43aeb5538fb87f1b04f8a0f1fabb87b93cdff58d8e2f7e8088da8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae71ef00f7465214b398c4a008518808504135fecaabffe60c19168ac24778e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a2369f181ed914a8e7e7fa292a73276147d0dc088dc71e9820f78b004f6775b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a57e2fddd3f9c1016415c3793016093eb4ac08eaa0ac2e6efb67f802b4b9359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2675e3ab47082616721c8effb53a1d973fb6b6bc5709ab894ed2b1c6b586a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56e50a1d6032c7f8599505511987bed665d5caa473892c1f26762c15573c0eff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://526ddb29a2bf9bb139e325d2459c8298c897144cd0cd1dcac22f728691b69fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c0d3525c1f67f2f50fcfcda722242fdb5205d8deff43bbe5f0b381971c8f13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.244564 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e25574c3b09af0947fdb01598c3d52fed2be9d89d35bd1c6de3ba38feec3d4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.265256 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d4f2998736899daf51be1da3ba943cf2d0b8b5fa329b7dba3fbd8ea408508b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66cb9fe2eef3302
5caf7d95e2097aea70fe687c873a66aa74f5ac6d04386de85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.285511 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-tjnkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aff0cd70-eca5-4222-85b8-dd4543122e01\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa1a06558d87b6ed95a69a5af3f696e1b4ea631ca6537676e1cfbd676729a245\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-tjnkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.304247 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lvlwv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e61ad46f-e059-42a8-a36b-cf791e3bf196\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74bd000e858e99d571fbed9f3623baaf3c8866a87c6ed1b37c34a6d562dc2b59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cbp9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lvlwv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.313695 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.313745 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.313757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.313776 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.313790 4806 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.323782 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.339393 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"888ccee0-4c6b-45ea-9d8c-00668327ca0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4098da79f5dc68619c65eb0b9ecc1aa1c788e0a47cdb41ed3c8610453763f5fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720
950f48aff0223db1d31add79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jwndx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.368396 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:52Z\\\",\\\"message\\\":\\\"hNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.153],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},
}\\\\nI0217 15:21:52.983143 6861 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/packageserver-service for network=default are: map[]\\\\nF0217 15:21:52.983162 6861 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:21:52Z is after 2025-08-24T17:21:41Z]\\\\nI02\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26f3c29d7c5fd806c0
b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bk7gh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-2m855\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.385786 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f610f849-5b18-4da6-9acb-fb2f81e87834\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d01315e50e4cb85772467e279d0f363a979fb861a399a39e39acabfb21cebed3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4acdf4a7a70b8db5991056c14d259b0942780683bdf7abb7d2c2c9cb6230ee25\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b624dac2479f0b9ac01d97cd5ff0b13a7666b65a4a8c910f20140654d334b8d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.402792 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d0efddb4d4e686bf58342f3e74448237baccf8fffcf83708c140dc6b098520c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.417192 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.417327 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.417560 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.417671 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.417780 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.423208 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a981fc6-90ce-4056-b041-a0089f3b40f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://925fb15f57fcbdffbf923e1fa30e8a20fa8a746e9d53608bcd9ab3a17b790d0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8b610f0205bbf141a4e8fc8e8271872ceffa03226845299bcf358ae92b30c61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://b4140f44fa8ef102c344d6a592148182b4daba8ddc5fa30b30fc437ae8bafde8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f09cfb11768ac71cffe68ecc9f394b6ec960cbe355c34776a382837b7d4a9838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2f70e1a451f6da3a04397870d73c03bf4cff7ca9487e6647035dbcab0ed05ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39092d84db2859856800a1ea141c05d863b5c6f5016a500aed633d30e42ec366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea5cc0d6bbff0a95c2d3e525184129a72cbe82050ba743b60a40d1898ac5fb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:21:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x24cx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r9b8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.443755 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wgg2s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"344f8a87-e00f-4f0a-a0bc-aee197271160\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T15:21:49Z\\\",\\\"message\\\":\\\"2026-02-17T15:21:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3\\\\n2026-02-17T15:21:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b7f3af2e-d3a0-40d7-b486-5731e19af1d3 to /host/opt/cni/bin/\\\\n2026-02-17T15:21:04Z [verbose] multus-daemon started\\\\n2026-02-17T15:21:04Z [verbose] Readiness Indicator file check\\\\n2026-02-17T15:21:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:21:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5kbf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wgg2s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.467101 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T15:20:59Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0217 15:20:58.924949 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0217 15:20:58.925126 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 15:20:58.926176 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1558749531/tls.crt::/tmp/serving-cert-1558749531/tls.key\\\\\\\"\\\\nI0217 15:20:59.416785 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 15:20:59.430085 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 15:20:59.430111 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 15:20:59.430134 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 15:20:59.430140 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 15:20:59.438173 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0217 15:20:59.438186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 15:20:59.438207 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438214 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 15:20:59.438222 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 15:20:59.438225 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 15:20:59.438229 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 15:20:59.438233 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 15:20:59.447229 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:20:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff223eee3a5935b775ed2109fbda24092f
b16fffd4b06d05b1193e6b1f76a90\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T15:20:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T15:20:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:20:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.484160 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.501925 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T15:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.518885 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f73041c-6d45-4e20-b119-00a5feae4d58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e23b345d47443a42317f47f4fcc86d56130f10f296f79a3bccef44384e5e08b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0733cc964de8f3d3cc6c7a01326c84239466
338592237d2eb3c0b8813db6952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T15:21:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4jnpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-jmb6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.521081 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.521129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.521143 4806 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.521164 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.521175 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.533630 4806 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h72qm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5af69f46-757a-4fab-adbd-d7a278868c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T15:21:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rj58x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T15:21:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h72qm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:09Z is after 2025-08-24T17:21:41Z" Feb 
17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.624369 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.624462 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.624485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.624515 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.624538 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.727239 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.727344 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.727370 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.727466 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.727491 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.829948 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.830007 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.830019 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.830035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.830048 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.933191 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.933288 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.933312 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.933339 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:09 crc kubenswrapper[4806]: I0217 15:22:09.933360 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:09Z","lastTransitionTime":"2026-02-17T15:22:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.036284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.036338 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.036362 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.036388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.036437 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.139130 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.139188 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.139205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.139228 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.139245 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.160317 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:10 crc kubenswrapper[4806]: E0217 15:22:10.160548 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.169493 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 03:19:19.355224496 +0000 UTC Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.242219 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.242279 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.242302 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.242333 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.242355 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.345930 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.345999 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.346024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.346053 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.346071 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.449446 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.449519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.449545 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.449588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.449620 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.553168 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.553228 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.553246 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.553270 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.553288 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.657002 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.657054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.657070 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.657095 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.657113 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.760295 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.760507 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.760561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.760592 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.760614 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.865668 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.865737 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.865757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.865786 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.865804 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.968671 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.968756 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.968773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.968800 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:10 crc kubenswrapper[4806]: I0217 15:22:10.968818 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:10Z","lastTransitionTime":"2026-02-17T15:22:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.072707 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.072765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.072782 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.072808 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.072826 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.160869 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:11 crc kubenswrapper[4806]: E0217 15:22:11.161070 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.161387 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:11 crc kubenswrapper[4806]: E0217 15:22:11.161519 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.161735 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:11 crc kubenswrapper[4806]: E0217 15:22:11.161849 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.170399 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:55:31.801661716 +0000 UTC Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.175264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.175317 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.175336 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.175359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.175378 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.278503 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.278562 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.278579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.278603 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.278623 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.381893 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.381955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.381973 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.382000 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.382019 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.485976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.486063 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.486087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.486118 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.486138 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.589859 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.589928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.589952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.589983 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.590006 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.693197 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.693262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.693281 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.693307 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.693326 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.796346 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.796438 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.796459 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.796486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.796505 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.899281 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.899372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.899392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.899442 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:11 crc kubenswrapper[4806]: I0217 15:22:11.899461 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:11Z","lastTransitionTime":"2026-02-17T15:22:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.003029 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.003119 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.003144 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.003170 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.003188 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.106276 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.106359 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.106377 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.106433 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.106452 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.160881 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:12 crc kubenswrapper[4806]: E0217 15:22:12.161114 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.171279 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:49:53.27882196 +0000 UTC Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.176054 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.209511 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.209589 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.209617 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.209649 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.209672 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.312525 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.312603 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.312627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.312658 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.312682 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.415975 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.416030 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.416052 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.416083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.416108 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.519266 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.519339 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.519358 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.519386 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.519431 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.622814 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.622892 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.622933 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.622987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.623008 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.726388 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.726504 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.726522 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.726548 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.726571 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.830113 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.830172 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.830184 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.830201 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.830213 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.932959 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.933017 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.933034 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.933058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:12 crc kubenswrapper[4806]: I0217 15:22:12.933076 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:12Z","lastTransitionTime":"2026-02-17T15:22:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.043856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.043924 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.043942 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.043965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.043984 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.147730 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.147801 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.147820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.147846 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.147864 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.159988 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.160056 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:13 crc kubenswrapper[4806]: E0217 15:22:13.160207 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.160271 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:13 crc kubenswrapper[4806]: E0217 15:22:13.160521 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:13 crc kubenswrapper[4806]: E0217 15:22:13.160650 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.171743 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 01:57:51.619409832 +0000 UTC Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.251102 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.251174 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.251193 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.251217 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.251234 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.354642 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.354711 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.354729 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.354756 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.354775 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.457984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.458065 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.458125 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.458158 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.458179 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.561724 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.561777 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.561796 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.561821 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.561837 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.664694 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.664746 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.664765 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.664789 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.664805 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.766959 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.766999 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.767011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.767027 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.767039 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.870526 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.870623 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.870644 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.870666 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.870684 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.974605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.974680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.974698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.975079 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:13 crc kubenswrapper[4806]: I0217 15:22:13.975126 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:13Z","lastTransitionTime":"2026-02-17T15:22:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.078810 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.078863 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.078880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.078904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.078921 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.160801 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:14 crc kubenswrapper[4806]: E0217 15:22:14.161046 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.171835 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:07:53.784471318 +0000 UTC Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.182092 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.182146 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.182163 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.182187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.182209 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.286024 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.286106 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.286129 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.286160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.286182 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.390145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.390214 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.390236 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.390262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.390280 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.494296 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.494352 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.494368 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.494392 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.494446 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.597287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.597343 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.597356 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.597375 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.597388 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.700341 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.700391 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.700426 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.700450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.700466 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.802785 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.802826 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.802836 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.802856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.802869 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.905748 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.905793 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.905816 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.905840 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:14 crc kubenswrapper[4806]: I0217 15:22:14.905856 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:14Z","lastTransitionTime":"2026-02-17T15:22:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.008366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.008497 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.008520 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.008558 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.008583 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.111809 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.111883 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.111902 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.111931 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.111951 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.161141 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.161265 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:15 crc kubenswrapper[4806]: E0217 15:22:15.161352 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.161378 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:15 crc kubenswrapper[4806]: E0217 15:22:15.161471 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:15 crc kubenswrapper[4806]: E0217 15:22:15.161614 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.172740 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:52:26.229468564 +0000 UTC Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.216486 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.216562 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.216583 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.216612 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.216633 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.319973 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.320048 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.320079 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.320112 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.320137 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.424163 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.424273 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.424298 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.424331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.424358 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.527655 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.527705 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.527715 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.527731 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.527743 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.630773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.630839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.630856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.630881 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.630897 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.734565 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.734672 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.734683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.734706 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.734727 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.837725 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.837820 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.837839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.837866 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.837887 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.941831 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.941977 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.941991 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.942011 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:15 crc kubenswrapper[4806]: I0217 15:22:15.942050 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:15Z","lastTransitionTime":"2026-02-17T15:22:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.046300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.046371 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.046403 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.046458 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.046477 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.150579 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.150680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.150703 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.150762 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.150782 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.161090 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.161336 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.172924 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 15:28:35.83599309 +0000 UTC Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.254175 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.254244 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.254261 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.254285 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.254302 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.357779 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.357856 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.357879 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.357905 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.357924 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.460855 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.460939 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.460959 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.460987 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.461007 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.539120 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.539201 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.539217 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.539243 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.539260 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.566145 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.571811 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.571860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.571873 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.571896 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.571911 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.592729 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.598205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.598264 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.598277 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.598300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.598312 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.618025 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.622901 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.622936 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.622945 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.622960 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.622972 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.641797 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.647439 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.647497 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.647514 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.647537 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.647548 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.666826 4806 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T15:22:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0e8a92da-ef57-4d82-8286-19572da4098f\\\",\\\"systemUUID\\\":\\\"aa772b6b-8722-482a-a8e2-1dcbd24be6c8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T15:22:16Z is after 2025-08-24T17:21:41Z" Feb 17 15:22:16 crc kubenswrapper[4806]: E0217 15:22:16.667043 4806 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.669525 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.669574 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.669593 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.669620 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.669639 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.773070 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.773139 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.773150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.773169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.773182 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.877069 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.877155 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.877179 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.877215 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.877235 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.980713 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.980788 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.980810 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.980837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:16 crc kubenswrapper[4806]: I0217 15:22:16.980857 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:16Z","lastTransitionTime":"2026-02-17T15:22:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.084518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.084605 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.084629 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.084656 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.084674 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.161031 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.161131 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.161155 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:17 crc kubenswrapper[4806]: E0217 15:22:17.161339 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:17 crc kubenswrapper[4806]: E0217 15:22:17.161576 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:17 crc kubenswrapper[4806]: E0217 15:22:17.162383 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.163041 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:22:17 crc kubenswrapper[4806]: E0217 15:22:17.163363 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.173554 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:00:44.693449421 +0000 UTC Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.187371 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.187470 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.187498 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.187547 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.187572 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.290776 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.290860 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.290887 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.290914 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.290933 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.394680 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.394752 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.394770 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.394796 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.394815 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.498160 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.498227 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.498245 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.498270 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.498288 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.601880 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.601963 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.601985 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.602018 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.602042 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.705123 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.705225 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.705248 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.705280 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.705302 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.808641 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.808711 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.808732 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.808768 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.808789 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.911988 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.912041 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.912058 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.912080 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:17 crc kubenswrapper[4806]: I0217 15:22:17.912097 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:17Z","lastTransitionTime":"2026-02-17T15:22:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.014973 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.015025 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.015036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.015055 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.015069 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.118479 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.118541 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.118558 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.118583 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.118603 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.160258 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:18 crc kubenswrapper[4806]: E0217 15:22:18.160564 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.173709 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 20:31:35.637156997 +0000 UTC Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.223903 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.223967 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.223989 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.224014 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.224033 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.334153 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.334231 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.334255 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.334287 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.334311 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.437499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.437590 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.437614 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.437645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.437666 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.541051 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.541108 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.541127 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.541150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.541166 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.644616 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.644697 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.644722 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.644750 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.644768 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.748172 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.748250 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.748271 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.748300 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.748323 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.797211 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:18 crc kubenswrapper[4806]: E0217 15:22:18.797493 4806 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:22:18 crc kubenswrapper[4806]: E0217 15:22:18.797627 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs podName:5af69f46-757a-4fab-adbd-d7a278868c94 nodeName:}" failed. No retries permitted until 2026-02-17 15:23:22.797595841 +0000 UTC m=+164.328226292 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs") pod "network-metrics-daemon-h72qm" (UID: "5af69f46-757a-4fab-adbd-d7a278868c94") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.851585 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.851652 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.851670 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.851694 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.851712 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.955528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.955623 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.955650 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.955682 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:18 crc kubenswrapper[4806]: I0217 15:22:18.955703 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:18Z","lastTransitionTime":"2026-02-17T15:22:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.059226 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.059323 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.059345 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.059372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.059393 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.160371 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:19 crc kubenswrapper[4806]: E0217 15:22:19.161150 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.161506 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.161518 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:19 crc kubenswrapper[4806]: E0217 15:22:19.161975 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:19 crc kubenswrapper[4806]: E0217 15:22:19.162003 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.163258 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.163317 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.163341 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.163374 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.163397 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.174441 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 06:36:03.115488988 +0000 UTC Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.193061 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.193033441 podStartE2EDuration="48.193033441s" podCreationTimestamp="2026-02-17 15:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.193035421 +0000 UTC m=+100.723665852" watchObservedRunningTime="2026-02-17 15:22:19.193033441 +0000 UTC m=+100.723663872" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.237778 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.237745873 podStartE2EDuration="1m20.237745873s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.236633744 +0000 UTC m=+100.767264175" watchObservedRunningTime="2026-02-17 15:22:19.237745873 +0000 UTC m=+100.768376324" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.265687 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.266372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.266668 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 
15:22:19.266942 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.267181 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.296772 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tjnkx" podStartSLOduration=80.296745543 podStartE2EDuration="1m20.296745543s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.283491992 +0000 UTC m=+100.814122403" watchObservedRunningTime="2026-02-17 15:22:19.296745543 +0000 UTC m=+100.827375994" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.318991 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lvlwv" podStartSLOduration=80.318970696 podStartE2EDuration="1m20.318970696s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.297835961 +0000 UTC m=+100.828466442" watchObservedRunningTime="2026-02-17 15:22:19.318970696 +0000 UTC m=+100.849601107" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.361942 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podStartSLOduration=80.361913103 
podStartE2EDuration="1m20.361913103s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.336308053 +0000 UTC m=+100.866938474" watchObservedRunningTime="2026-02-17 15:22:19.361913103 +0000 UTC m=+100.892543534" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.370274 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.370334 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.370347 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.370369 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.370383 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.380696 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.380667396 podStartE2EDuration="1m19.380667396s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.379141517 +0000 UTC m=+100.909771968" watchObservedRunningTime="2026-02-17 15:22:19.380667396 +0000 UTC m=+100.911297837"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.405243 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.405218397 podStartE2EDuration="7.405218397s" podCreationTimestamp="2026-02-17 15:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.392390258 +0000 UTC m=+100.923020669" watchObservedRunningTime="2026-02-17 15:22:19.405218397 +0000 UTC m=+100.935848808"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.443050 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r9b8d" podStartSLOduration=79.443032172 podStartE2EDuration="1m19.443032172s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.42819413 +0000 UTC m=+100.958824561" watchObservedRunningTime="2026-02-17 15:22:19.443032172 +0000 UTC m=+100.973662583"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.443181 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wgg2s" podStartSLOduration=79.443177586 podStartE2EDuration="1m19.443177586s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.443138975 +0000 UTC m=+100.973769396" watchObservedRunningTime="2026-02-17 15:22:19.443177586 +0000 UTC m=+100.973807997"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.459669 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.45964225 podStartE2EDuration="1m20.45964225s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.45848047 +0000 UTC m=+100.989110901" watchObservedRunningTime="2026-02-17 15:22:19.45964225 +0000 UTC m=+100.990272661"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.472806 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.472882 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.472898 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.472923 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.472941 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.511455 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jmb6j" podStartSLOduration=79.511432584 podStartE2EDuration="1m19.511432584s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:19.51046722 +0000 UTC m=+101.041097671" watchObservedRunningTime="2026-02-17 15:22:19.511432584 +0000 UTC m=+101.042063005"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.575517 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.575571 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.575583 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.575602 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.575614 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.678641 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.678706 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.678720 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.678739 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.678750 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.781394 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.781834 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.782001 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.782207 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.782416 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.885969 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.886290 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.886375 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.886560 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.886674 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.990318 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.990361 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.990372 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.990393 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:19 crc kubenswrapper[4806]: I0217 15:22:19.990436 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:19Z","lastTransitionTime":"2026-02-17T15:22:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.093556 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.093645 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.093672 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.093703 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.093723 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.160532 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm"
Feb 17 15:22:20 crc kubenswrapper[4806]: E0217 15:22:20.160733 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.175045 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:32:00.654029592 +0000 UTC
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.196687 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.196949 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.197145 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.197331 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.197540 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.301332 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.301736 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.301918 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.302083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.302230 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.405926 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.406002 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.406056 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.406085 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.406102 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.509205 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.509266 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.509283 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.509311 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.509330 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.612100 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.612159 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.612182 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.612210 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.612233 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.715155 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.715229 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.715254 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.715284 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.715310 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.817952 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.818022 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.818045 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.818074 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.818097 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.922297 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.922366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.922382 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.922441 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:20 crc kubenswrapper[4806]: I0217 15:22:20.922463 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:20Z","lastTransitionTime":"2026-02-17T15:22:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.025389 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.025473 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.025490 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.025513 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.025530 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.128965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.129028 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.129052 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.129083 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.129106 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.160888 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.161004 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 15:22:21 crc kubenswrapper[4806]: E0217 15:22:21.161103 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.161181 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 15:22:21 crc kubenswrapper[4806]: E0217 15:22:21.161497 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 15:22:21 crc kubenswrapper[4806]: E0217 15:22:21.161622 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.176295 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 09:32:15.904157105 +0000 UTC
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.232563 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.232617 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.232634 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.232656 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.232675 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.335464 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.335518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.335532 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.335550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.335561 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.437453 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.437497 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.437509 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.437528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.437539 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.540548 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.540595 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.540608 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.540628 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.540637 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.643316 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.643385 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.643450 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.643487 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.643540 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.746837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.746922 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.746945 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.746978 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.746999 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.850025 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.850087 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.850106 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.850133 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.850150 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.953822 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.953883 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.953904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.953928 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:21 crc kubenswrapper[4806]: I0217 15:22:21.953948 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:21Z","lastTransitionTime":"2026-02-17T15:22:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.057396 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.057499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.057519 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.057550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.057571 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.159991 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm"
Feb 17 15:22:22 crc kubenswrapper[4806]: E0217 15:22:22.160211 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.160565 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.160630 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.160642 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.160661 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.160672 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.176463 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 22:16:29.608359322 +0000 UTC Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.264596 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.264668 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.264677 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.264700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.264715 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.366861 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.366932 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.366955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.366984 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.367006 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.470976 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.471036 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.471054 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.471079 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.471097 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.574837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.574908 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.574934 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.574965 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.574986 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.677492 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.677561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.677588 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.677619 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.677643 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.780400 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.780475 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.780496 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.780518 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.780533 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.883002 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.883062 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.883100 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.883118 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.883128 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.986105 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.986136 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.986147 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.986162 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:22 crc kubenswrapper[4806]: I0217 15:22:22.986172 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:22Z","lastTransitionTime":"2026-02-17T15:22:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.089169 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.089230 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.089253 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.089279 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.089296 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.159943 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.159988 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.160086 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:23 crc kubenswrapper[4806]: E0217 15:22:23.160237 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:23 crc kubenswrapper[4806]: E0217 15:22:23.160396 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:23 crc kubenswrapper[4806]: E0217 15:22:23.160708 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.177429 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 05:14:09.265735019 +0000 UTC Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.192734 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.192812 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.192837 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.192869 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.192892 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.295939 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.295981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.295995 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.296013 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.296025 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.399214 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.399272 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.399291 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.399315 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.399333 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.501818 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.501878 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.501895 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.501923 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.501940 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.605664 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.605744 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.605767 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.605800 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.605820 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.708773 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.708859 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.708881 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.708904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.708921 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.812650 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.812721 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.812740 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.812764 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.812781 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.916140 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.916192 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.916209 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.916230 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:23 crc kubenswrapper[4806]: I0217 15:22:23.916247 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:23Z","lastTransitionTime":"2026-02-17T15:22:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.019955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.020033 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.020052 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.020076 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.020115 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.123467 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.123540 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.123558 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.123584 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.123639 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.160657 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:24 crc kubenswrapper[4806]: E0217 15:22:24.160784 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.178501 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 18:57:37.400366895 +0000 UTC Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.226610 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.226664 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.226683 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.226709 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.226727 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.329955 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.330031 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.330055 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.330085 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.330108 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.433208 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.433262 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.433281 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.433305 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.433325 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.536449 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.536487 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.536499 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.536516 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.536529 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.640200 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.640281 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.640305 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.640334 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.640358 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.743839 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.743927 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.743950 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.743978 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.743999 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.846835 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.846891 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.846906 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.846933 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.846951 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.950280 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.950528 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.950561 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.950597 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:24 crc kubenswrapper[4806]: I0217 15:22:24.950623 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:24Z","lastTransitionTime":"2026-02-17T15:22:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.053649 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.053700 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.053719 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.053741 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.053754 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.157337 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.157405 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.157475 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.157509 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.157530 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.160936 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.160987 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.161107 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:25 crc kubenswrapper[4806]: E0217 15:22:25.161165 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:25 crc kubenswrapper[4806]: E0217 15:22:25.161271 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:25 crc kubenswrapper[4806]: E0217 15:22:25.161396 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.179485 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:36:54.231431426 +0000 UTC Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.260698 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.260742 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.260757 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.260780 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.260792 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.364278 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.364354 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.364380 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.364456 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.364486 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.467366 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.467468 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.467485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.467501 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.467513 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.570595 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.570667 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.570685 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.570722 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.570743 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.672822 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.672859 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.672870 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.672888 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.672900 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.775956 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.776003 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.776015 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.776032 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.776045 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.878747 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.878822 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.878848 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.878876 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.878893 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.982627 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.982675 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.982685 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.982701 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:25 crc kubenswrapper[4806]: I0217 15:22:25.982714 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:25Z","lastTransitionTime":"2026-02-17T15:22:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.085063 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.085119 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.085131 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.085150 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.085162 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.160098 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:26 crc kubenswrapper[4806]: E0217 15:22:26.160390 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.179693 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 10:15:23.455227284 +0000 UTC Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.188904 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.188959 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.188981 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.189213 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.189262 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.291920 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.291989 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.292009 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.292035 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.292052 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.394676 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.394737 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.394749 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.394777 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.394791 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.498111 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.498174 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.498187 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.498213 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.498231 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.600383 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.600460 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.600473 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.600492 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.600503 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.703496 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.703566 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.703580 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.703595 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.703603 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.805817 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.805858 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.805871 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.805891 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.805906 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.909863 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.909909 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.909931 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.909950 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.909962 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.918485 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.918550 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.918574 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.918603 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 15:22:26 crc kubenswrapper[4806]: I0217 15:22:26.918630 4806 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T15:22:26Z","lastTransitionTime":"2026-02-17T15:22:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.010166 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7"] Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.010652 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.013621 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.013905 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.019297 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.019703 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.095037 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a3a222-19ce-411b-9ec9-dc2590a43e66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.095101 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79a3a222-19ce-411b-9ec9-dc2590a43e66-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.095141 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/79a3a222-19ce-411b-9ec9-dc2590a43e66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.095183 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.095219 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.161010 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.161107 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.161910 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:27 crc kubenswrapper[4806]: E0217 15:22:27.162070 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:27 crc kubenswrapper[4806]: E0217 15:22:27.162187 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:27 crc kubenswrapper[4806]: E0217 15:22:27.162349 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.180612 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 11:52:59.524837675 +0000 UTC Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.180709 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.192075 4806 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196468 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196514 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196592 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a3a222-19ce-411b-9ec9-dc2590a43e66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" 
Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196615 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79a3a222-19ce-411b-9ec9-dc2590a43e66-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196633 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a3a222-19ce-411b-9ec9-dc2590a43e66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196631 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.196761 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/79a3a222-19ce-411b-9ec9-dc2590a43e66-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.198158 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/79a3a222-19ce-411b-9ec9-dc2590a43e66-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.205734 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79a3a222-19ce-411b-9ec9-dc2590a43e66-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.230231 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a3a222-19ce-411b-9ec9-dc2590a43e66-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fdxr7\" (UID: \"79a3a222-19ce-411b-9ec9-dc2590a43e66\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.326348 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.796865 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" event={"ID":"79a3a222-19ce-411b-9ec9-dc2590a43e66","Type":"ContainerStarted","Data":"1358c33a7998b73138238e215c37146496cd8a7119024da28884fb8ee0a18426"} Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.796959 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" event={"ID":"79a3a222-19ce-411b-9ec9-dc2590a43e66","Type":"ContainerStarted","Data":"6ac0d6886c8ddaea49ed540f8481523841f666d5fa915c50862e960cf74fc149"} Feb 17 15:22:27 crc kubenswrapper[4806]: I0217 15:22:27.819534 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fdxr7" podStartSLOduration=87.819506413 podStartE2EDuration="1m27.819506413s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:27.817888431 +0000 UTC m=+109.348518942" watchObservedRunningTime="2026-02-17 15:22:27.819506413 +0000 UTC m=+109.350136864" Feb 17 15:22:28 crc kubenswrapper[4806]: I0217 15:22:28.161508 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:28 crc kubenswrapper[4806]: I0217 15:22:28.161643 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:22:28 crc kubenswrapper[4806]: E0217 15:22:28.161813 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-2m855_openshift-ovn-kubernetes(1e6a2d66-f11a-48f6-8d86-5295cb917b7f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" Feb 17 15:22:28 crc kubenswrapper[4806]: E0217 15:22:28.161858 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:29 crc kubenswrapper[4806]: I0217 15:22:29.160575 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:29 crc kubenswrapper[4806]: I0217 15:22:29.160648 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:29 crc kubenswrapper[4806]: I0217 15:22:29.162316 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:29 crc kubenswrapper[4806]: E0217 15:22:29.162511 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:29 crc kubenswrapper[4806]: E0217 15:22:29.162692 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:29 crc kubenswrapper[4806]: E0217 15:22:29.162918 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:30 crc kubenswrapper[4806]: I0217 15:22:30.159991 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:30 crc kubenswrapper[4806]: E0217 15:22:30.160280 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:31 crc kubenswrapper[4806]: I0217 15:22:31.160977 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:31 crc kubenswrapper[4806]: I0217 15:22:31.161014 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:31 crc kubenswrapper[4806]: I0217 15:22:31.161058 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:31 crc kubenswrapper[4806]: E0217 15:22:31.161110 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:31 crc kubenswrapper[4806]: E0217 15:22:31.161251 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:31 crc kubenswrapper[4806]: E0217 15:22:31.161457 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:32 crc kubenswrapper[4806]: I0217 15:22:32.160285 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:32 crc kubenswrapper[4806]: E0217 15:22:32.160556 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:33 crc kubenswrapper[4806]: I0217 15:22:33.160744 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:33 crc kubenswrapper[4806]: I0217 15:22:33.160767 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:33 crc kubenswrapper[4806]: E0217 15:22:33.160927 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:33 crc kubenswrapper[4806]: I0217 15:22:33.161023 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:33 crc kubenswrapper[4806]: E0217 15:22:33.161258 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:33 crc kubenswrapper[4806]: E0217 15:22:33.161345 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:34 crc kubenswrapper[4806]: I0217 15:22:34.160979 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:34 crc kubenswrapper[4806]: E0217 15:22:34.161129 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:35 crc kubenswrapper[4806]: I0217 15:22:35.160197 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:35 crc kubenswrapper[4806]: I0217 15:22:35.160590 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:35 crc kubenswrapper[4806]: I0217 15:22:35.160629 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:35 crc kubenswrapper[4806]: E0217 15:22:35.160960 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:35 crc kubenswrapper[4806]: E0217 15:22:35.161113 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:35 crc kubenswrapper[4806]: E0217 15:22:35.161287 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.160263 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:36 crc kubenswrapper[4806]: E0217 15:22:36.160415 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.832114 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/1.log" Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.832854 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/0.log" Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.832956 4806 generic.go:334] "Generic (PLEG): container finished" podID="344f8a87-e00f-4f0a-a0bc-aee197271160" containerID="4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735" exitCode=1 Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.833007 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerDied","Data":"4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735"} Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.833074 4806 scope.go:117] "RemoveContainer" containerID="1bf163b0adc12b7760745a281ea3d36a14d46f900bd3aef0d6c1c35f85e05fbc" Feb 17 15:22:36 crc kubenswrapper[4806]: I0217 15:22:36.849643 4806 scope.go:117] "RemoveContainer" containerID="4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735" Feb 17 15:22:36 crc kubenswrapper[4806]: E0217 15:22:36.850156 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wgg2s_openshift-multus(344f8a87-e00f-4f0a-a0bc-aee197271160)\"" pod="openshift-multus/multus-wgg2s" podUID="344f8a87-e00f-4f0a-a0bc-aee197271160" Feb 17 15:22:37 crc kubenswrapper[4806]: I0217 15:22:37.160319 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:37 crc kubenswrapper[4806]: I0217 15:22:37.160397 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:37 crc kubenswrapper[4806]: E0217 15:22:37.160588 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:37 crc kubenswrapper[4806]: I0217 15:22:37.160653 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:37 crc kubenswrapper[4806]: E0217 15:22:37.160762 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:37 crc kubenswrapper[4806]: E0217 15:22:37.160840 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:37 crc kubenswrapper[4806]: I0217 15:22:37.838748 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/1.log" Feb 17 15:22:38 crc kubenswrapper[4806]: I0217 15:22:38.160921 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:38 crc kubenswrapper[4806]: E0217 15:22:38.161453 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:39 crc kubenswrapper[4806]: I0217 15:22:39.161138 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:39 crc kubenswrapper[4806]: E0217 15:22:39.161362 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:39 crc kubenswrapper[4806]: I0217 15:22:39.161461 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:39 crc kubenswrapper[4806]: E0217 15:22:39.161627 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:39 crc kubenswrapper[4806]: I0217 15:22:39.162601 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:39 crc kubenswrapper[4806]: E0217 15:22:39.162882 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:39 crc kubenswrapper[4806]: E0217 15:22:39.203045 4806 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 15:22:39 crc kubenswrapper[4806]: E0217 15:22:39.273270 4806 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 15:22:40 crc kubenswrapper[4806]: I0217 15:22:40.160842 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:40 crc kubenswrapper[4806]: E0217 15:22:40.161039 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:41 crc kubenswrapper[4806]: I0217 15:22:41.161457 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:41 crc kubenswrapper[4806]: E0217 15:22:41.161641 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:41 crc kubenswrapper[4806]: I0217 15:22:41.161715 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:41 crc kubenswrapper[4806]: I0217 15:22:41.161741 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:41 crc kubenswrapper[4806]: E0217 15:22:41.161929 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:41 crc kubenswrapper[4806]: E0217 15:22:41.162001 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:42 crc kubenswrapper[4806]: I0217 15:22:42.160618 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:42 crc kubenswrapper[4806]: E0217 15:22:42.160852 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.160052 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.160150 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:43 crc kubenswrapper[4806]: E0217 15:22:43.160264 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.160163 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:43 crc kubenswrapper[4806]: E0217 15:22:43.160547 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:43 crc kubenswrapper[4806]: E0217 15:22:43.160387 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.161790 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.864571 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/3.log" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.867478 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerStarted","Data":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.868683 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:22:43 crc kubenswrapper[4806]: I0217 15:22:43.906947 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podStartSLOduration=103.906925125 podStartE2EDuration="1m43.906925125s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:22:43.906234318 +0000 UTC m=+125.436864749" watchObservedRunningTime="2026-02-17 15:22:43.906925125 +0000 UTC m=+125.437555546" Feb 17 15:22:44 crc kubenswrapper[4806]: I0217 15:22:44.160653 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:44 crc kubenswrapper[4806]: E0217 15:22:44.160869 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:44 crc kubenswrapper[4806]: I0217 15:22:44.179880 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h72qm"] Feb 17 15:22:44 crc kubenswrapper[4806]: E0217 15:22:44.274849 4806 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 15:22:44 crc kubenswrapper[4806]: I0217 15:22:44.871997 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:44 crc kubenswrapper[4806]: E0217 15:22:44.872934 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:45 crc kubenswrapper[4806]: I0217 15:22:45.160730 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:45 crc kubenswrapper[4806]: I0217 15:22:45.160930 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:45 crc kubenswrapper[4806]: E0217 15:22:45.161043 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:45 crc kubenswrapper[4806]: I0217 15:22:45.161113 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:45 crc kubenswrapper[4806]: E0217 15:22:45.161302 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:45 crc kubenswrapper[4806]: E0217 15:22:45.161537 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:47 crc kubenswrapper[4806]: I0217 15:22:47.160102 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:47 crc kubenswrapper[4806]: I0217 15:22:47.160206 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:47 crc kubenswrapper[4806]: E0217 15:22:47.160291 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:47 crc kubenswrapper[4806]: E0217 15:22:47.160390 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:47 crc kubenswrapper[4806]: I0217 15:22:47.160515 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:47 crc kubenswrapper[4806]: E0217 15:22:47.160611 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:47 crc kubenswrapper[4806]: I0217 15:22:47.160663 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:47 crc kubenswrapper[4806]: E0217 15:22:47.160738 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:48 crc kubenswrapper[4806]: I0217 15:22:48.161706 4806 scope.go:117] "RemoveContainer" containerID="4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735" Feb 17 15:22:48 crc kubenswrapper[4806]: I0217 15:22:48.890803 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/1.log" Feb 17 15:22:48 crc kubenswrapper[4806]: I0217 15:22:48.891513 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerStarted","Data":"568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa"} Feb 17 15:22:49 crc kubenswrapper[4806]: I0217 15:22:49.160027 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:49 crc kubenswrapper[4806]: I0217 15:22:49.160049 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:49 crc kubenswrapper[4806]: I0217 15:22:49.160111 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:49 crc kubenswrapper[4806]: E0217 15:22:49.161810 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:49 crc kubenswrapper[4806]: I0217 15:22:49.161866 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:49 crc kubenswrapper[4806]: E0217 15:22:49.162015 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:49 crc kubenswrapper[4806]: E0217 15:22:49.162127 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:49 crc kubenswrapper[4806]: E0217 15:22:49.162284 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:49 crc kubenswrapper[4806]: E0217 15:22:49.276339 4806 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 15:22:51 crc kubenswrapper[4806]: I0217 15:22:51.160943 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:51 crc kubenswrapper[4806]: I0217 15:22:51.161018 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:51 crc kubenswrapper[4806]: I0217 15:22:51.160947 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:51 crc kubenswrapper[4806]: E0217 15:22:51.161154 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:51 crc kubenswrapper[4806]: I0217 15:22:51.161188 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:51 crc kubenswrapper[4806]: E0217 15:22:51.161367 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:51 crc kubenswrapper[4806]: E0217 15:22:51.161511 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:51 crc kubenswrapper[4806]: E0217 15:22:51.161638 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:53 crc kubenswrapper[4806]: I0217 15:22:53.160653 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:53 crc kubenswrapper[4806]: E0217 15:22:53.160868 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h72qm" podUID="5af69f46-757a-4fab-adbd-d7a278868c94" Feb 17 15:22:53 crc kubenswrapper[4806]: I0217 15:22:53.161575 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:53 crc kubenswrapper[4806]: E0217 15:22:53.161709 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 15:22:53 crc kubenswrapper[4806]: I0217 15:22:53.161734 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:53 crc kubenswrapper[4806]: E0217 15:22:53.161847 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 15:22:53 crc kubenswrapper[4806]: I0217 15:22:53.161928 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:53 crc kubenswrapper[4806]: E0217 15:22:53.162175 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.160631 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.160737 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.160861 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.161562 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.162778 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.163560 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.164248 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.164258 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.164356 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 15:22:55 crc kubenswrapper[4806]: I0217 15:22:55.164648 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.821791 4806 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.907794 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.908508 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.909229 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.909729 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.914262 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.915139 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.916091 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.916890 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.917774 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5qkvd"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.918936 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.922354 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.922647 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.922825 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.923158 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.923641 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.924623 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.925518 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926016 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926123 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926275 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926559 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h29nb"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926663 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926767 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926891 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.926935 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927120 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927135 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927324 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927337 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"] Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927513 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927585 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927740 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927851 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927880 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927908 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.927757 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928026 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928047 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928139 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928173 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928318 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.928833 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.929513 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.929972 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.929535 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.934220 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.934505 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.934591 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.934673 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.934735 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.937629 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.938202 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.938783 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-clfv7"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.939769 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-clfv7"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.941247 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.941846 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.942263 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4sj79"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.943423 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.962124 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.979035 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.979459 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.979976 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.980491 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r4ldh"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981092 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r4ldh"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981343 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4sj79"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981446 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981610 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981775 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.981875 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982055 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982101 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982183 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982370 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982448 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982465 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982526 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982565 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982674 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982765 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982828 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982902 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982970 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983061 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983115 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983151 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983260 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983299 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983357 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.982368 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983710 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983822 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983909 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.983989 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984025 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984078 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984162 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984249 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984294 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984332 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984373 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984502 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984574 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984607 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984620 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984700 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984715 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.984777 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.986231 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42n4r"]
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.986286 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.986870 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.987388 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988473 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988635 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988755 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988822 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988874 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.988976 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.989236 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993010 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993096 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnmh\" (UniqueName: \"kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993122 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993158 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993178 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993474 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993484 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993662 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.993840 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.994433 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.994591 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.994803 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.994986 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.995132 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.995165 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.995314 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.995503 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.995711 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.996007 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.996144 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 15:22:57 crc kubenswrapper[4806]: I0217 15:22:57.996306 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.003071 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.003856 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.003948 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.005425 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.005633 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.006110 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.006528 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.007195 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.007551 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.008695 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c4tb8"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.009287 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.013049 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.013358 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.014284 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.040218 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.048154 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.060970 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.061105 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.061772 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.062155 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.062798 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.062878 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-28rqh"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.063067 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.063514 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.063628 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.064187 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.066355 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.066743 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.066975 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.067018 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.067719 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.069359 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.071749 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.072274 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.072701 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.072745 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.072828 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.072975 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.073337 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.073617 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.074010 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.074118 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r5477"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.074797 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.075141 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x5v75"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.075903 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.076358 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-x5v75"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.076605 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.077938 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.078183 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gb5xj"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.079197 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.079391 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.080447 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.081313 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.082536 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.083680 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.084970 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h29nb"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.086444 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4sj79"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.087075 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-clfv7"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.088369 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.089524 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.090465 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.091428 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.092367 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"]
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093747 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093786 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-config\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093809 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8649291-1472-4f42-b8fe-447fa805d681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093845 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093865 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-image-import-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093883 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-serving-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093899 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093916 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-images\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.093973 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-machine-approver-tls\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094029 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-serving-cert\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094049 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-encryption-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094067 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2svx\" (UniqueName: \"kubernetes.io/projected/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-kube-api-access-t2svx\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094082 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-audit\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094126 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094168 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-audit-dir\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094195 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-client\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094214 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6j2\" (UniqueName: \"kubernetes.io/projected/c8649291-1472-4f42-b8fe-447fa805d681-kube-api-access-xw6j2\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"
Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094238 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094257 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094828 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-auth-proxy-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.094955 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-node-pullsecrets\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.095071 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.095202 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.095313 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnmh\" (UniqueName: \"kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.095426 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcqt\" (UniqueName: \"kubernetes.io/projected/673f6847-2447-49d3-9e10-5b7ae3363435-kube-api-access-jfcqt\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.095583 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.096349 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.097759 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.101070 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.102418 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.106365 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.106434 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ds4hl"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.107285 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.113877 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5nlh8"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.113997 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.114492 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.114783 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5nlh8" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.115169 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.116890 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x5v75"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.118919 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.121232 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.122891 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.125050 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"] Feb 17 15:22:58 crc 
kubenswrapper[4806]: I0217 15:22:58.126795 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.128004 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.129040 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42n4r"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.130002 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.131983 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5qkvd"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.133723 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.134757 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.135980 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r5477"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.137046 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.138142 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c4tb8"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.139105 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-28rqh"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.140139 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.141117 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.142734 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gb5xj"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.143649 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5nlh8"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.144803 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.148744 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.150508 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.152485 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7jn9k"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.153040 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7jn9k" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.153871 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7jn9k"] Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.170718 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.190881 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197005 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcqt\" (UniqueName: \"kubernetes.io/projected/673f6847-2447-49d3-9e10-5b7ae3363435-kube-api-access-jfcqt\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197048 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197078 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-config\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197102 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8649291-1472-4f42-b8fe-447fa805d681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197141 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-image-import-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197166 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-serving-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197191 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197215 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-images\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197239 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-machine-approver-tls\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197269 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-serving-cert\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197292 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-encryption-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197316 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2svx\" (UniqueName: \"kubernetes.io/projected/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-kube-api-access-t2svx\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197339 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-audit\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 
15:22:58.197363 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-audit-dir\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197384 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-client\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197428 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6j2\" (UniqueName: \"kubernetes.io/projected/c8649291-1472-4f42-b8fe-447fa805d681-kube-api-access-xw6j2\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197453 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197479 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-auth-proxy-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc 
kubenswrapper[4806]: I0217 15:22:58.197501 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-node-pullsecrets\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197614 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-node-pullsecrets\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197779 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/673f6847-2447-49d3-9e10-5b7ae3363435-audit-dir\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.197948 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198192 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-serving-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198249 
4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-config\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198787 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198831 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c8649291-1472-4f42-b8fe-447fa805d681-images\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198791 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-audit\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.198922 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.199232 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/673f6847-2447-49d3-9e10-5b7ae3363435-image-import-ca\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.199547 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-auth-proxy-config\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.200622 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8649291-1472-4f42-b8fe-447fa805d681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.200846 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-machine-approver-tls\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.201463 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-encryption-config\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.202299 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-etcd-client\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.202391 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/673f6847-2447-49d3-9e10-5b7ae3363435-serving-cert\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.210985 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.230116 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.250350 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.270667 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.290570 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.310895 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.331017 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.350393 4806 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.370495 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.390831 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.410282 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.430358 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.451253 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.471196 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.491207 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.512718 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.531274 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.551457 4806 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.571253 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.591659 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.610704 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.631013 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.651228 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.671146 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.712093 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.731496 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.751484 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.771272 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 15:22:58 crc 
kubenswrapper[4806]: I0217 15:22:58.791833 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.811681 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.831777 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.850839 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.870702 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.910712 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.931076 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.951229 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.972431 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 15:22:58 crc kubenswrapper[4806]: I0217 15:22:58.992065 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.011478 4806 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.032440 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.051513 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.069782 4806 request.go:700] Waited for 1.007671131s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver-operator/secrets?fieldSelector=metadata.name%3Dkube-apiserver-operator-serving-cert&limit=500&resourceVersion=0 Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.071951 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.092151 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.111173 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.131901 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.150895 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.171535 4806 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.191571 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.211624 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.231439 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.251238 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.271279 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.290477 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.310860 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.331964 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.350996 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.371342 4806 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.391620 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.410433 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.441940 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.450997 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.471671 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.491119 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.510453 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.531832 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.550956 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.571873 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.593025 4806 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.611850 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.632474 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.651267 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.671397 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.690744 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.711139 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.732011 4806 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.751555 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.790908 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.804611 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnmh\" (UniqueName: 
\"kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh\") pod \"controller-manager-879f6c89f-8lcn2\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.812148 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.830983 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.851134 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.870924 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.891365 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.911032 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.930939 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.950787 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 15:22:59 crc kubenswrapper[4806]: I0217 15:22:59.971205 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.016240 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jfcqt\" (UniqueName: \"kubernetes.io/projected/673f6847-2447-49d3-9e10-5b7ae3363435-kube-api-access-jfcqt\") pod \"apiserver-76f77b778f-5qkvd\" (UID: \"673f6847-2447-49d3-9e10-5b7ae3363435\") " pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.029632 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.045829 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6j2\" (UniqueName: \"kubernetes.io/projected/c8649291-1472-4f42-b8fe-447fa805d681-kube-api-access-xw6j2\") pod \"machine-api-operator-5694c8668f-dhdkp\" (UID: \"c8649291-1472-4f42-b8fe-447fa805d681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.050202 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2svx\" (UniqueName: \"kubernetes.io/projected/435b5b72-d2ad-4d48-94a8-ed1da78cb5c6-kube-api-access-t2svx\") pod \"machine-approver-56656f9798-k8m74\" (UID: \"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.108530 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.118916 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87968279-7e35-4a0a-b1a2-bbdd91ea184d-metrics-tls\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.118951 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-trusted-ca-bundle\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.118971 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.118989 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9131411b-b7d0-47b5-a4a5-ce289282d5c3-config\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119015 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119032 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119047 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-config\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119060 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8625b175-0be3-4e29-bc64-09452e0f87ca-serving-cert\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119074 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: 
\"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119090 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhcxf\" (UniqueName: \"kubernetes.io/projected/87968279-7e35-4a0a-b1a2-bbdd91ea184d-kube-api-access-lhcxf\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119106 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-client\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119121 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcqr4\" (UniqueName: \"kubernetes.io/projected/1e510f3a-7afd-4c62-92a4-e898a6b635fe-kube-api-access-lcqr4\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119149 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8464248c-a865-432e-86e0-d67bd9609645-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119171 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119193 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119207 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-serving-cert\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119222 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-trusted-ca\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119236 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-ca\") pod \"etcd-operator-b45778765-42n4r\" 
(UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119251 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119267 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119283 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdnb9\" (UniqueName: \"kubernetes.io/projected/954c1de6-6017-448a-addb-5fdc73d0987b-kube-api-access-pdnb9\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119300 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119318 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-client\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119336 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd80c0d6-5372-4ad5-a5e1-45ecad939749-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119360 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119374 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9131411b-b7d0-47b5-a4a5-ce289282d5c3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119389 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggwwl\" (UniqueName: \"kubernetes.io/projected/58f729fc-8ddb-43b2-897b-23fffea83c73-kube-api-access-ggwwl\") pod 
\"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119517 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-config\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119536 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9hhr\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-kube-api-access-s9hhr\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119556 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-policies\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119578 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nsp\" (UniqueName: \"kubernetes.io/projected/45c1d170-0968-44d2-b9cd-5dcd8732afc3-kube-api-access-w6nsp\") pod \"downloads-7954f5f757-clfv7\" (UID: \"45c1d170-0968-44d2-b9cd-5dcd8732afc3\") " pod="openshift-console/downloads-7954f5f757-clfv7" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119618 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq4dp\" (UniqueName: \"kubernetes.io/projected/8464248c-a865-432e-86e0-d67bd9609645-kube-api-access-jq4dp\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119639 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e510f3a-7afd-4c62-92a4-e898a6b635fe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119658 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f729fc-8ddb-43b2-897b-23fffea83c73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119691 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119710 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5chb\" (UniqueName: 
\"kubernetes.io/projected/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-kube-api-access-v5chb\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119731 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fp4h\" (UniqueName: \"kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119752 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c335350f-90c3-4c01-9b58-423e540ea120-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119772 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjsfv\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-kube-api-access-qjsfv\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119794 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: 
\"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119816 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119836 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119858 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rcrq\" (UniqueName: \"kubernetes.io/projected/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-kube-api-access-4rcrq\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119887 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119909 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c59008-76e7-4196-9ef3-001a9be8bc7d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119945 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4bv4\" (UniqueName: \"kubernetes.io/projected/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-kube-api-access-f4bv4\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119977 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.119996 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120016 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/954c1de6-6017-448a-addb-5fdc73d0987b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120036 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120057 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f729fc-8ddb-43b2-897b-23fffea83c73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120078 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-metrics-certs\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120096 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n79\" (UniqueName: \"kubernetes.io/projected/6be3fc8f-849e-4d01-948a-46bc9ca06a05-kube-api-access-g9n79\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " 
pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120117 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c59008-76e7-4196-9ef3-001a9be8bc7d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120148 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2c2n\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120168 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e510f3a-7afd-4c62-92a4-e898a6b635fe-proxy-tls\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120188 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120207 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120373 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8464248c-a865-432e-86e0-d67bd9609645-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120396 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120436 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c54g\" (UniqueName: \"kubernetes.io/projected/910f725b-6137-4ae8-b86f-9fc53af7d1ce-kube-api-access-6c54g\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120454 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120495 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-oauth-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120533 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-config\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120549 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6vb\" (UniqueName: \"kubernetes.io/projected/c335350f-90c3-4c01-9b58-423e540ea120-kube-api-access-rh6vb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120673 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c59008-76e7-4196-9ef3-001a9be8bc7d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120749 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120827 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9131411b-b7d0-47b5-a4a5-ce289282d5c3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.120985 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd80c0d6-5372-4ad5-a5e1-45ecad939749-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121075 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-metrics-tls\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121131 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-encryption-config\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121262 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-dir\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121341 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121393 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121461 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be3fc8f-849e-4d01-948a-46bc9ca06a05-service-ca-bundle\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " 
pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121508 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2v4n\" (UniqueName: \"kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121594 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/910f725b-6137-4ae8-b86f-9fc53af7d1ce-serving-cert\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121663 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-serving-cert\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.121725 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.122195 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.622168596 +0000 UTC m=+142.152799047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.122617 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.122802 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c335350f-90c3-4c01-9b58-423e540ea120-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.122899 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-oauth-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " 
pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.123081 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.123257 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-service-ca\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.123930 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.125965 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.126087 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqfxm\" 
(UniqueName: \"kubernetes.io/projected/8625b175-0be3-4e29-bc64-09452e0f87ca-kube-api-access-sqfxm\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129113 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-service-ca\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129201 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129250 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129294 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-stats-auth\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 
15:23:00.129340 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-trusted-ca\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129383 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxj8\" (UniqueName: \"kubernetes.io/projected/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-kube-api-access-mbxj8\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129480 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954c1de6-6017-448a-addb-5fdc73d0987b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.129512 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-default-certificate\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.144252 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231464 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231672 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-service-ca\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231716 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231738 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqfxm\" (UniqueName: \"kubernetes.io/projected/8625b175-0be3-4e29-bc64-09452e0f87ca-kube-api-access-sqfxm\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231761 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-service-ca\") pod 
\"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231781 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231806 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-csi-data-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231827 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231848 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94c8\" (UniqueName: \"kubernetes.io/projected/b6086c5b-4528-4e20-b9a8-67b20b450516-kube-api-access-k94c8\") pod \"migrator-59844c95c7-pmw47\" (UID: \"b6086c5b-4528-4e20-b9a8-67b20b450516\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231869 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231890 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231911 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pxgw\" (UniqueName: \"kubernetes.io/projected/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-kube-api-access-6pxgw\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231932 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-webhook-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.231967 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954c1de6-6017-448a-addb-5fdc73d0987b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc 
kubenswrapper[4806]: I0217 15:23:00.231987 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-default-certificate\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232007 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-stats-auth\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232026 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-trusted-ca\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232047 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxj8\" (UniqueName: \"kubernetes.io/projected/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-kube-api-access-mbxj8\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232070 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcg28\" (UniqueName: \"kubernetes.io/projected/76315151-b675-410e-9ed9-8e39ebd883b3-kube-api-access-kcg28\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: 
\"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232092 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87968279-7e35-4a0a-b1a2-bbdd91ea184d-metrics-tls\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232116 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-socket-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232136 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-certs\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232157 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-trusted-ca-bundle\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232178 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-key\") pod 
\"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232199 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6rff\" (UniqueName: \"kubernetes.io/projected/4828e844-f021-4591-ab25-ca198d3e577b-kube-api-access-v6rff\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232224 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232246 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9131411b-b7d0-47b5-a4a5-ce289282d5c3-config\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232266 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232288 
4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232309 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-config\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232333 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25b773c8-e4fa-4b3c-ab59-74105f1296af-metrics-tls\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232354 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b952n\" (UniqueName: \"kubernetes.io/projected/4a3ff543-139c-48f8-a201-103c00c8b23e-kube-api-access-b952n\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232375 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmz2\" (UniqueName: \"kubernetes.io/projected/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-kube-api-access-qfmz2\") pod \"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232397 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8625b175-0be3-4e29-bc64-09452e0f87ca-serving-cert\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232450 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232472 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhcxf\" (UniqueName: \"kubernetes.io/projected/87968279-7e35-4a0a-b1a2-bbdd91ea184d-kube-api-access-lhcxf\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232491 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-client\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232514 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zc9z\" (UniqueName: 
\"kubernetes.io/projected/77e696b4-bfbb-4600-8f7a-91772f7e8322-kube-api-access-4zc9z\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232533 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/074f20d5-eaad-4185-88d1-fae34a78e015-tmpfs\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232555 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232579 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcqr4\" (UniqueName: \"kubernetes.io/projected/1e510f3a-7afd-4c62-92a4-e898a6b635fe-kube-api-access-lcqr4\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232598 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-cabundle\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232620 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfxc\" (UniqueName: \"kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232654 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8464248c-a865-432e-86e0-d67bd9609645-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232677 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232711 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232734 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-serving-cert\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232753 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-registration-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232774 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-trusted-ca\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232795 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-ca\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232815 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232833 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jw26\" (UniqueName: \"kubernetes.io/projected/18fcb65b-e08a-4c4b-b8c3-d474117395b5-kube-api-access-5jw26\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232865 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232883 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-profile-collector-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232905 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdnb9\" (UniqueName: \"kubernetes.io/projected/954c1de6-6017-448a-addb-5fdc73d0987b-kube-api-access-pdnb9\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232928 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd80c0d6-5372-4ad5-a5e1-45ecad939749-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232950 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.232972 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-client\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233002 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233033 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9131411b-b7d0-47b5-a4a5-ce289282d5c3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233056 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggwwl\" (UniqueName: 
\"kubernetes.io/projected/58f729fc-8ddb-43b2-897b-23fffea83c73-kube-api-access-ggwwl\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233078 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-config\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233102 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9hhr\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-kube-api-access-s9hhr\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233123 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-policies\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233145 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4828e844-f021-4591-ab25-ca198d3e577b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" Feb 17 15:23:00 crc 
kubenswrapper[4806]: I0217 15:23:00.233165 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-mountpoint-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233189 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4dp\" (UniqueName: \"kubernetes.io/projected/8464248c-a865-432e-86e0-d67bd9609645-kube-api-access-jq4dp\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233212 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nsp\" (UniqueName: \"kubernetes.io/projected/45c1d170-0968-44d2-b9cd-5dcd8732afc3-kube-api-access-w6nsp\") pod \"downloads-7954f5f757-clfv7\" (UID: \"45c1d170-0968-44d2-b9cd-5dcd8732afc3\") " pod="openshift-console/downloads-7954f5f757-clfv7" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233235 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46737ae1-a5eb-453f-aa74-2af76d30d7c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233259 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1e510f3a-7afd-4c62-92a4-e898a6b635fe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233291 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f729fc-8ddb-43b2-897b-23fffea83c73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233313 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233334 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-plugins-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233353 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.233374 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6shw\" (UniqueName: \"kubernetes.io/projected/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-kube-api-access-t6shw\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.233834 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.7338086 +0000 UTC m=+142.264439011 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234044 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5chb\" (UniqueName: \"kubernetes.io/projected/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-kube-api-access-v5chb\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234110 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fp4h\" (UniqueName: \"kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234136 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshfn\" (UniqueName: \"kubernetes.io/projected/25b773c8-e4fa-4b3c-ab59-74105f1296af-kube-api-access-rshfn\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234161 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c335350f-90c3-4c01-9b58-423e540ea120-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234194 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjsfv\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-kube-api-access-qjsfv\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234221 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25b773c8-e4fa-4b3c-ab59-74105f1296af-config-volume\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234242 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-apiservice-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234266 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234295 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234320 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.234372 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" Feb 17 
15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235639 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rcrq\" (UniqueName: \"kubernetes.io/projected/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-kube-api-access-4rcrq\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235682 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwtll\" (UniqueName: \"kubernetes.io/projected/defc246b-6f58-4f10-82c1-a07c2ef017ca-kube-api-access-dwtll\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235724 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235744 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235767 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/954c1de6-6017-448a-addb-5fdc73d0987b-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235787 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c59008-76e7-4196-9ef3-001a9be8bc7d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235805 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4bv4\" (UniqueName: \"kubernetes.io/projected/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-kube-api-access-f4bv4\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235829 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f729fc-8ddb-43b2-897b-23fffea83c73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235848 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235868 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzr9\" (UniqueName: \"kubernetes.io/projected/074f20d5-eaad-4185-88d1-fae34a78e015-kube-api-access-phzr9\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235897 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vw4w\" (UniqueName: \"kubernetes.io/projected/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-kube-api-access-4vw4w\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235918 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235942 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2c2n\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235962 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-metrics-certs\") pod \"router-default-5444994796-r4ldh\" (UID: 
\"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.235983 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n79\" (UniqueName: \"kubernetes.io/projected/6be3fc8f-849e-4d01-948a-46bc9ca06a05-kube-api-access-g9n79\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236004 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c59008-76e7-4196-9ef3-001a9be8bc7d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236028 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46737ae1-a5eb-453f-aa74-2af76d30d7c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236051 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e510f3a-7afd-4c62-92a4-e898a6b635fe-proxy-tls\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236074 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-node-bootstrap-token\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236098 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236122 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236137 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-service-ca\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.236143 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqzp\" (UniqueName: \"kubernetes.io/projected/15adb8fa-833b-4335-9788-50d5bb34e14d-kube-api-access-prqzp\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc 
kubenswrapper[4806]: I0217 15:23:00.242950 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defc246b-6f58-4f10-82c1-a07c2ef017ca-config\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243005 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-images\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243029 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-srv-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243072 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8464248c-a865-432e-86e0-d67bd9609645-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243099 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-trusted-ca-bundle\") 
pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243132 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c54g\" (UniqueName: \"kubernetes.io/projected/910f725b-6137-4ae8-b86f-9fc53af7d1ce-kube-api-access-6c54g\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243158 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243184 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77e696b4-bfbb-4600-8f7a-91772f7e8322-proxy-tls\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243206 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-oauth-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243232 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18fcb65b-e08a-4c4b-b8c3-d474117395b5-cert\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243275 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-config\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243299 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6vb\" (UniqueName: \"kubernetes.io/projected/c335350f-90c3-4c01-9b58-423e540ea120-kube-api-access-rh6vb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243325 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243350 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c59008-76e7-4196-9ef3-001a9be8bc7d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243372 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2lb\" (UniqueName: \"kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243417 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9131411b-b7d0-47b5-a4a5-ce289282d5c3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243444 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243467 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243474 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd80c0d6-5372-4ad5-a5e1-45ecad939749-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243629 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-metrics-tls\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243696 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defc246b-6f58-4f10-82c1-a07c2ef017ca-serving-cert\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.243723 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/76315151-b675-410e-9ed9-8e39ebd883b3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: \"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244282 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244328 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-encryption-config\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244352 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-dir\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244374 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244418 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be3fc8f-849e-4d01-948a-46bc9ca06a05-service-ca-bundle\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244449 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n2v4n\" (UniqueName: \"kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244472 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244486 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-config\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244497 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244567 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/910f725b-6137-4ae8-b86f-9fc53af7d1ce-serving-cert\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244648 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-serving-cert\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244716 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244786 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-metrics-certs\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244817 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244874 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-srv-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244907 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46737ae1-a5eb-453f-aa74-2af76d30d7c3-config\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244947 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244965 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c335350f-90c3-4c01-9b58-423e540ea120-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.244994 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.245001 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-oauth-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.245050 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.245576 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-dir\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.245655 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-trusted-ca-bundle\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.246980 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8464248c-a865-432e-86e0-d67bd9609645-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.247298 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.747282715 +0000 UTC m=+142.277913146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.247621 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-trusted-ca\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.248483 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.249153 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.250077 
4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.251987 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.252348 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-oauth-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.252545 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.252669 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-serving-cert\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 
15:23:00.253295 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-encryption-config\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.253393 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87968279-7e35-4a0a-b1a2-bbdd91ea184d-metrics-tls\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.253574 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-config\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.255700 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.255762 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd80c0d6-5372-4ad5-a5e1-45ecad939749-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.256006 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be3fc8f-849e-4d01-948a-46bc9ca06a05-service-ca-bundle\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.256619 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e510f3a-7afd-4c62-92a4-e898a6b635fe-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.256882 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.257142 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd80c0d6-5372-4ad5-a5e1-45ecad939749-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.257223 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-config\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.258005 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c59008-76e7-4196-9ef3-001a9be8bc7d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.258440 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.258564 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c335350f-90c3-4c01-9b58-423e540ea120-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.260225 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-client\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.260684 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8625b175-0be3-4e29-bc64-09452e0f87ca-serving-cert\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.260893 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910f725b-6137-4ae8-b86f-9fc53af7d1ce-service-ca-bundle\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.261337 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-ca\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.261506 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f729fc-8ddb-43b2-897b-23fffea83c73-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.261746 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.261876 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.262014 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.262635 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.262645 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-etcd-service-ca\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.262982 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.263303 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-metrics-tls\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.263723 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0c59008-76e7-4196-9ef3-001a9be8bc7d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.263964 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-audit-policies\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.264418 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.262485 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-serving-cert\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.264555 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c335350f-90c3-4c01-9b58-423e540ea120-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265131 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265181 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-oauth-config\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265330 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265339 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-etcd-client\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265598 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/910f725b-6137-4ae8-b86f-9fc53af7d1ce-serving-cert\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265757 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9131411b-b7d0-47b5-a4a5-ce289282d5c3-config\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.265763 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f729fc-8ddb-43b2-897b-23fffea83c73-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.266076 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.267863 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268221 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268494 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/954c1de6-6017-448a-addb-5fdc73d0987b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268577 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8625b175-0be3-4e29-bc64-09452e0f87ca-trusted-ca\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268720 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268759 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.268923 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-default-certificate\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.269052 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8464248c-a865-432e-86e0-d67bd9609645-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.269381 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-console-serving-cert\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.269908 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e510f3a-7afd-4c62-92a4-e898a6b635fe-proxy-tls\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.271832 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.276909 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9131411b-b7d0-47b5-a4a5-ce289282d5c3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.278232 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6be3fc8f-849e-4d01-948a-46bc9ca06a05-stats-auth\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " pod="openshift-ingress/router-default-5444994796-r4ldh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.278573 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rcrq\" (UniqueName: \"kubernetes.io/projected/7b3651e3-2763-4f4b-a953-3ec65b52a8a7-kube-api-access-4rcrq\") pod \"apiserver-7bbb656c7d-2hntk\" (UID: \"7b3651e3-2763-4f4b-a953-3ec65b52a8a7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.278779 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954c1de6-6017-448a-addb-5fdc73d0987b-serving-cert\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.279515 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.286933 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqfxm\" (UniqueName: \"kubernetes.io/projected/8625b175-0be3-4e29-bc64-09452e0f87ca-kube-api-access-sqfxm\") pod \"console-operator-58897d9998-h29nb\" (UID: \"8625b175-0be3-4e29-bc64-09452e0f87ca\") " pod="openshift-console-operator/console-operator-58897d9998-h29nb"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.302357 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.306664 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2c2n\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.326736 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0c59008-76e7-4196-9ef3-001a9be8bc7d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-czntk\" (UID: \"c0c59008-76e7-4196-9ef3-001a9be8bc7d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.345084 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.345108 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.345866 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346075 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25b773c8-e4fa-4b3c-ab59-74105f1296af-config-volume\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346101 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-apiservice-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.346150 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.84612769 +0000 UTC m=+142.376758101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346229 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwtll\" (UniqueName: \"kubernetes.io/projected/defc246b-6f58-4f10-82c1-a07c2ef017ca-kube-api-access-dwtll\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346267 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzr9\" (UniqueName: \"kubernetes.io/projected/074f20d5-eaad-4185-88d1-fae34a78e015-kube-api-access-phzr9\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346285 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vw4w\" (UniqueName: \"kubernetes.io/projected/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-kube-api-access-4vw4w\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346304 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346330 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46737ae1-a5eb-453f-aa74-2af76d30d7c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346360 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-node-bootstrap-token\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346379 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqzp\" (UniqueName: \"kubernetes.io/projected/15adb8fa-833b-4335-9788-50d5bb34e14d-kube-api-access-prqzp\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346416 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defc246b-6f58-4f10-82c1-a07c2ef017ca-config\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346431 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-images\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346445 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-srv-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346468 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77e696b4-bfbb-4600-8f7a-91772f7e8322-proxy-tls\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346485 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18fcb65b-e08a-4c4b-b8c3-d474117395b5-cert\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346508 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346529 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2lb\" (UniqueName: \"kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346550 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defc246b-6f58-4f10-82c1-a07c2ef017ca-serving-cert\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346570 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/76315151-b675-410e-9ed9-8e39ebd883b3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: \"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346589 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346607 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346637 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346656 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-srv-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346693 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46737ae1-a5eb-453f-aa74-2af76d30d7c3-config\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346713 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-csi-data-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346728 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346745 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94c8\" (UniqueName: \"kubernetes.io/projected/b6086c5b-4528-4e20-b9a8-67b20b450516-kube-api-access-k94c8\") pod \"migrator-59844c95c7-pmw47\" (UID: \"b6086c5b-4528-4e20-b9a8-67b20b450516\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346760 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pxgw\" (UniqueName: \"kubernetes.io/projected/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-kube-api-access-6pxgw\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346776 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-webhook-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346798 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcg28\" (UniqueName: \"kubernetes.io/projected/76315151-b675-410e-9ed9-8e39ebd883b3-kube-api-access-kcg28\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: \"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346814 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-socket-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346830 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-certs\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346845 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-key\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346860 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6rff\" (UniqueName: \"kubernetes.io/projected/4828e844-f021-4591-ab25-ca198d3e577b-kube-api-access-v6rff\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346880 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25b773c8-e4fa-4b3c-ab59-74105f1296af-metrics-tls\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346894 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b952n\" (UniqueName: \"kubernetes.io/projected/4a3ff543-139c-48f8-a201-103c00c8b23e-kube-api-access-b952n\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346918 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmz2\" (UniqueName: \"kubernetes.io/projected/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-kube-api-access-qfmz2\") pod \"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346940 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zc9z\" (UniqueName: \"kubernetes.io/projected/77e696b4-bfbb-4600-8f7a-91772f7e8322-kube-api-access-4zc9z\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346956 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/074f20d5-eaad-4185-88d1-fae34a78e015-tmpfs\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID:
\"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346971 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.346991 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-cabundle\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347008 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfxc\" (UniqueName: \"kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347038 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-registration-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347055 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jw26\" (UniqueName: 
\"kubernetes.io/projected/18fcb65b-e08a-4c4b-b8c3-d474117395b5-kube-api-access-5jw26\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347074 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-profile-collector-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347121 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4828e844-f021-4591-ab25-ca198d3e577b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347137 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-mountpoint-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347161 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46737ae1-a5eb-453f-aa74-2af76d30d7c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347177 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-plugins-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347191 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347210 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6shw\" (UniqueName: \"kubernetes.io/projected/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-kube-api-access-t6shw\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347237 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshfn\" (UniqueName: \"kubernetes.io/projected/25b773c8-e4fa-4b3c-ab59-74105f1296af-kube-api-access-rshfn\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.347499 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25b773c8-e4fa-4b3c-ab59-74105f1296af-config-volume\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.348203 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-socket-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.352941 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-certs\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.353039 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4bv4\" (UniqueName: \"kubernetes.io/projected/0e9e9afe-0548-44f0-a905-4ba3c9aa16af-kube-api-access-f4bv4\") pod \"etcd-operator-b45778765-42n4r\" (UID: \"0e9e9afe-0548-44f0-a905-4ba3c9aa16af\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.353217 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-images\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.354257 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defc246b-6f58-4f10-82c1-a07c2ef017ca-config\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.354338 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-webhook-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.354910 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.355395 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46737ae1-a5eb-453f-aa74-2af76d30d7c3-config\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.355696 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/074f20d5-eaad-4185-88d1-fae34a78e015-apiservice-cert\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.356513 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77e696b4-bfbb-4600-8f7a-91772f7e8322-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 
15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.356561 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-csi-data-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.356803 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.856785587 +0000 UTC m=+142.387415998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.357687 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.357780 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-registration-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 
15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.357815 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-mountpoint-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.358367 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/074f20d5-eaad-4185-88d1-fae34a78e015-tmpfs\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.359221 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.359382 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-plugins-dir\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.360193 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/76315151-b675-410e-9ed9-8e39ebd883b3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: 
\"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.361035 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-cabundle\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.362865 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-dhdkp"] Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.363336 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.363690 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defc246b-6f58-4f10-82c1-a07c2ef017ca-serving-cert\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.363844 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15adb8fa-833b-4335-9788-50d5bb34e14d-signing-key\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.366378 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-node-bootstrap-token\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.367577 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-srv-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.368213 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77e696b4-bfbb-4600-8f7a-91772f7e8322-proxy-tls\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.368432 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-srv-cert\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.368728 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 
15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.370796 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18fcb65b-e08a-4c4b-b8c3-d474117395b5-cert\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.372419 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4a3ff543-139c-48f8-a201-103c00c8b23e-profile-collector-cert\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.372802 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46737ae1-a5eb-453f-aa74-2af76d30d7c3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.372805 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4828e844-f021-4591-ab25-ca198d3e577b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.376273 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n79\" (UniqueName: \"kubernetes.io/projected/6be3fc8f-849e-4d01-948a-46bc9ca06a05-kube-api-access-g9n79\") pod \"router-default-5444994796-r4ldh\" (UID: \"6be3fc8f-849e-4d01-948a-46bc9ca06a05\") " 
pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.379088 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25b773c8-e4fa-4b3c-ab59-74105f1296af-metrics-tls\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.381598 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.383587 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9131411b-b7d0-47b5-a4a5-ce289282d5c3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lz4lh\" (UID: \"9131411b-b7d0-47b5-a4a5-ce289282d5c3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" Feb 17 15:23:00 crc kubenswrapper[4806]: W0217 15:23:00.387752 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8649291_1472_4f42_b8fe_447fa805d681.slice/crio-76ba0fa81315312318e28c76af86b4c1bef2d879735eca1950b9771ccda94195 WatchSource:0}: Error finding container 76ba0fa81315312318e28c76af86b4c1bef2d879735eca1950b9771ccda94195: Status 404 returned error can't find the container with id 76ba0fa81315312318e28c76af86b4c1bef2d879735eca1950b9771ccda94195 Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.413418 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5qkvd"] Feb 17 
15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.418769 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.429891 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.447983 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.448481 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:00.948449019 +0000 UTC m=+142.479079480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.449141 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcqr4\" (UniqueName: \"kubernetes.io/projected/1e510f3a-7afd-4c62-92a4-e898a6b635fe-kube-api-access-lcqr4\") pod \"machine-config-controller-84d6567774-hg9q5\" (UID: \"1e510f3a-7afd-4c62-92a4-e898a6b635fe\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.469674 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhcxf\" (UniqueName: \"kubernetes.io/projected/87968279-7e35-4a0a-b1a2-bbdd91ea184d-kube-api-access-lhcxf\") pod \"dns-operator-744455d44c-c4tb8\" (UID: \"87968279-7e35-4a0a-b1a2-bbdd91ea184d\") " pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.473836 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.487644 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq4dp\" (UniqueName: \"kubernetes.io/projected/8464248c-a865-432e-86e0-d67bd9609645-kube-api-access-jq4dp\") pod \"kube-storage-version-migrator-operator-b67b599dd-tjb2d\" (UID: \"8464248c-a865-432e-86e0-d67bd9609645\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.510918 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggwwl\" (UniqueName: \"kubernetes.io/projected/58f729fc-8ddb-43b2-897b-23fffea83c73-kube-api-access-ggwwl\") pod \"openshift-apiserver-operator-796bbdcf4f-6mpm9\" (UID: \"58f729fc-8ddb-43b2-897b-23fffea83c73\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.514157 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.552184 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.552827 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.052810647 +0000 UTC m=+142.583441058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.554397 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5chb\" (UniqueName: \"kubernetes.io/projected/ac7c661c-cf5d-418e-89d3-bc516cabd0e6-kube-api-access-v5chb\") pod \"console-f9d7485db-4sj79\" (UID: \"ac7c661c-cf5d-418e-89d3-bc516cabd0e6\") " pod="openshift-console/console-f9d7485db-4sj79"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.558017 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.559082 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fp4h\" (UniqueName: \"kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h\") pod \"route-controller-manager-6576b87f9c-rbxx6\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.579205 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.579983 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdnb9\" (UniqueName: \"kubernetes.io/projected/954c1de6-6017-448a-addb-5fdc73d0987b-kube-api-access-pdnb9\") pod \"openshift-config-operator-7777fb866f-dsdqr\" (UID: \"954c1de6-6017-448a-addb-5fdc73d0987b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.592912 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.592967 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.601708 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r4ldh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.615163 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c54g\" (UniqueName: \"kubernetes.io/projected/910f725b-6137-4ae8-b86f-9fc53af7d1ce-kube-api-access-6c54g\") pod \"authentication-operator-69f744f599-lfqq8\" (UID: \"910f725b-6137-4ae8-b86f-9fc53af7d1ce\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.625948 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4sj79"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.628979 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.636672 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.638020 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjsfv\" (UniqueName: \"kubernetes.io/projected/cd80c0d6-5372-4ad5-a5e1-45ecad939749-kube-api-access-qjsfv\") pod \"cluster-image-registry-operator-dc59b4c8b-kkbn8\" (UID: \"cd80c0d6-5372-4ad5-a5e1-45ecad939749\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.646145 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h29nb"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.650514 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.653307 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.653788 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.153766603 +0000 UTC m=+142.684397024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.653903 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.654356 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.154346677 +0000 UTC m=+142.684977088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.658001 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.658265 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.673810 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nsp\" (UniqueName: \"kubernetes.io/projected/45c1d170-0968-44d2-b9cd-5dcd8732afc3-kube-api-access-w6nsp\") pod \"downloads-7954f5f757-clfv7\" (UID: \"45c1d170-0968-44d2-b9cd-5dcd8732afc3\") " pod="openshift-console/downloads-7954f5f757-clfv7"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.686751 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.697309 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2v4n\" (UniqueName: \"kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n\") pod \"oauth-openshift-558db77b4-dp9cm\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.720469 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxj8\" (UniqueName: \"kubernetes.io/projected/2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0-kube-api-access-mbxj8\") pod \"cluster-samples-operator-665b6dd947-d2rfl\" (UID: \"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.729158 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6vb\" (UniqueName: \"kubernetes.io/projected/c335350f-90c3-4c01-9b58-423e540ea120-kube-api-access-rh6vb\") pod \"openshift-controller-manager-operator-756b6f6bc6-pht7c\" (UID: \"c335350f-90c3-4c01-9b58-423e540ea120\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.746347 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.752325 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9hhr\" (UniqueName: \"kubernetes.io/projected/f8c7556e-9967-49d8-aa86-a82a9a6eb29a-kube-api-access-s9hhr\") pod \"ingress-operator-5b745b69d9-zmp4t\" (UID: \"f8c7556e-9967-49d8-aa86-a82a9a6eb29a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.755588 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.755979 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.255958278 +0000 UTC m=+142.786588689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: W0217 15:23:00.774266 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3651e3_2763_4f4b_a953_3ec65b52a8a7.slice/crio-e07a34c91df63076ee58dc323022d9e76262ce6d79c6aae0187d83011fe89465 WatchSource:0}: Error finding container e07a34c91df63076ee58dc323022d9e76262ce6d79c6aae0187d83011fe89465: Status 404 returned error can't find the container with id e07a34c91df63076ee58dc323022d9e76262ce6d79c6aae0187d83011fe89465
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.778565 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshfn\" (UniqueName: \"kubernetes.io/projected/25b773c8-e4fa-4b3c-ab59-74105f1296af-kube-api-access-rshfn\") pod \"dns-default-5nlh8\" (UID: \"25b773c8-e4fa-4b3c-ab59-74105f1296af\") " pod="openshift-dns/dns-default-5nlh8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.780159 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.786240 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwtll\" (UniqueName: \"kubernetes.io/projected/defc246b-6f58-4f10-82c1-a07c2ef017ca-kube-api-access-dwtll\") pod \"service-ca-operator-777779d784-r5477\" (UID: \"defc246b-6f58-4f10-82c1-a07c2ef017ca\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.814656 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzr9\" (UniqueName: \"kubernetes.io/projected/074f20d5-eaad-4185-88d1-fae34a78e015-kube-api-access-phzr9\") pod \"packageserver-d55dfcdfc-bxsmr\" (UID: \"074f20d5-eaad-4185-88d1-fae34a78e015\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.823734 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.827021 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vw4w\" (UniqueName: \"kubernetes.io/projected/5dce6f5c-d2cb-4d42-9635-8dc6d4f10481-kube-api-access-4vw4w\") pod \"machine-config-server-ds4hl\" (UID: \"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481\") " pod="openshift-machine-config-operator/machine-config-server-ds4hl"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.827441 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.836752 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.846741 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcg28\" (UniqueName: \"kubernetes.io/projected/76315151-b675-410e-9ed9-8e39ebd883b3-kube-api-access-kcg28\") pod \"package-server-manager-789f6589d5-tbmdq\" (UID: \"76315151-b675-410e-9ed9-8e39ebd883b3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.851070 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.856620 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.857249 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.357234982 +0000 UTC m=+142.887865393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.865270 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-clfv7"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.870863 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2lb\" (UniqueName: \"kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb\") pod \"collect-profiles-29522355-mn87x\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.873978 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.887573 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6rff\" (UniqueName: \"kubernetes.io/projected/4828e844-f021-4591-ab25-ca198d3e577b-kube-api-access-v6rff\") pod \"multus-admission-controller-857f4d67dd-28rqh\" (UID: \"4828e844-f021-4591-ab25-ca198d3e577b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.887713 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.917629 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/46737ae1-a5eb-453f-aa74-2af76d30d7c3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-r7shr\" (UID: \"46737ae1-a5eb-453f-aa74-2af76d30d7c3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.929209 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"]
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.933574 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqzp\" (UniqueName: \"kubernetes.io/projected/15adb8fa-833b-4335-9788-50d5bb34e14d-kube-api-access-prqzp\") pod \"service-ca-9c57cc56f-x5v75\" (UID: \"15adb8fa-833b-4335-9788-50d5bb34e14d\") " pod="openshift-service-ca/service-ca-9c57cc56f-x5v75"
Feb 17 15:23:00 crc kubenswrapper[4806]: W0217 15:23:00.936524 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9131411b_b7d0_47b5_a4a5_ce289282d5c3.slice/crio-f2da644df7e853e5e62146db1cab6085cdd1f8bb6cfe11ad5086144972038662 WatchSource:0}: Error finding container f2da644df7e853e5e62146db1cab6085cdd1f8bb6cfe11ad5086144972038662: Status 404 returned error can't find the container with id f2da644df7e853e5e62146db1cab6085cdd1f8bb6cfe11ad5086144972038662
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.948983 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pxgw\" (UniqueName: \"kubernetes.io/projected/42513bc4-7e15-4e8f-b1aa-006b42a10ff4-kube-api-access-6pxgw\") pod \"csi-hostpathplugin-gb5xj\" (UID: \"42513bc4-7e15-4e8f-b1aa-006b42a10ff4\") " pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.956638 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" event={"ID":"c0c59008-76e7-4196-9ef3-001a9be8bc7d","Type":"ContainerStarted","Data":"a2b4e35e3fa888390b0b379f8deae6c384206ba6a2750164a259d2c1cf2b0912"}
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.957472 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:00 crc kubenswrapper[4806]: E0217 15:23:00.957784 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.457770427 +0000 UTC m=+142.988400838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.965985 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b952n\" (UniqueName: \"kubernetes.io/projected/4a3ff543-139c-48f8-a201-103c00c8b23e-kube-api-access-b952n\") pod \"catalog-operator-68c6474976-52z2c\" (UID: \"4a3ff543-139c-48f8-a201-103c00c8b23e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.972699 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.990945 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh"
Feb 17 15:23:00 crc kubenswrapper[4806]: I0217 15:23:00.991770 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jw26\" (UniqueName: \"kubernetes.io/projected/18fcb65b-e08a-4c4b-b8c3-d474117395b5-kube-api-access-5jw26\") pod \"ingress-canary-7jn9k\" (UID: \"18fcb65b-e08a-4c4b-b8c3-d474117395b5\") " pod="openshift-ingress-canary/ingress-canary-7jn9k"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.008088 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" event={"ID":"c9d54745-0a0c-436a-8ead-26184660d59c","Type":"ContainerStarted","Data":"3a16e68c2cdfe7061b0f82fde54eaba8f4f59db28cb1cff4f3ad065df0f65a11"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.008133 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" event={"ID":"c9d54745-0a0c-436a-8ead-26184660d59c","Type":"ContainerStarted","Data":"6c56167c2bca3531e42aa4b24d3bd26e6a22f5cc5b64fdce4da10a79835df054"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.008254 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmz2\" (UniqueName: \"kubernetes.io/projected/4efc9c9c-8be8-41de-b524-dfb7dc45c3d0-kube-api-access-qfmz2\") pod \"control-plane-machine-set-operator-78cbb6b69f-w6lrr\" (UID: \"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.008321 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.008823 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.013265 4806 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8lcn2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.013320 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" podUID="c9d54745-0a0c-436a-8ead-26184660d59c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.028554 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zc9z\" (UniqueName: \"kubernetes.io/projected/77e696b4-bfbb-4600-8f7a-91772f7e8322-kube-api-access-4zc9z\") pod \"machine-config-operator-74547568cd-s45gp\" (UID: \"77e696b4-bfbb-4600-8f7a-91772f7e8322\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.029806 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.035607 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h29nb" event={"ID":"8625b175-0be3-4e29-bc64-09452e0f87ca","Type":"ContainerStarted","Data":"a3c5d8bcf5f299e5bdaa618b95d9c83040579e46c28a66369d9ba7355007948c"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.035836 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-x5v75"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.042804 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" event={"ID":"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6","Type":"ContainerStarted","Data":"353105f3e7257fe4fd65d2d69e2e710a24e36335332144bae2e30cfcdb2e2074"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.042846 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" event={"ID":"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6","Type":"ContainerStarted","Data":"3eecf3afa45c07814679c0f47c8575322f6444f1c0aa8a8fbf0d30b649956132"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.045198 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.046104 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfxc\" (UniqueName: \"kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc\") pod \"marketplace-operator-79b997595-6nnnv\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.048212 4806 generic.go:334] "Generic (PLEG): container finished" podID="673f6847-2447-49d3-9e10-5b7ae3363435" containerID="1a51efa4cbf8572f455c7b0abad05d43b989d9ff018d1d5f4a1d81ae965bb9c9" exitCode=0
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.049356 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" event={"ID":"673f6847-2447-49d3-9e10-5b7ae3363435","Type":"ContainerDied","Data":"1a51efa4cbf8572f455c7b0abad05d43b989d9ff018d1d5f4a1d81ae965bb9c9"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.049388 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" event={"ID":"673f6847-2447-49d3-9e10-5b7ae3363435","Type":"ContainerStarted","Data":"9d0f6812fdcfcf9586edb7b8c694744374f43ac330c50bd93b1b55719b258801"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.059045 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.059602 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.059885 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.559874501 +0000 UTC m=+143.090504902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.065134 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" event={"ID":"7b3651e3-2763-4f4b-a953-3ec65b52a8a7","Type":"ContainerStarted","Data":"e07a34c91df63076ee58dc323022d9e76262ce6d79c6aae0187d83011fe89465"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.065259 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ds4hl"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.073518 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5nlh8"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.078859 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" event={"ID":"c8649291-1472-4f42-b8fe-447fa805d681","Type":"ContainerStarted","Data":"1b28f3d37449c77447b6210f7afff11c940504de36e6884baebc29323affd721"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.078929 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" event={"ID":"c8649291-1472-4f42-b8fe-447fa805d681","Type":"ContainerStarted","Data":"cae81647864eb59a0400a9b1c099446567af555c294e7e073493bbe33db0001d"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.078943 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" event={"ID":"c8649291-1472-4f42-b8fe-447fa805d681","Type":"ContainerStarted","Data":"76ba0fa81315312318e28c76af86b4c1bef2d879735eca1950b9771ccda94195"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.079085 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7jn9k"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.080892 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r4ldh" event={"ID":"6be3fc8f-849e-4d01-948a-46bc9ca06a05","Type":"ContainerStarted","Data":"b143fdfb4205add0a77635ca5bdf22afee558ddb1a7737e092569e8a71ef72df"}
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.082819 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6shw\" (UniqueName: \"kubernetes.io/projected/130d718f-ff56-44e7-87ab-f0c2b1a99e9b-kube-api-access-t6shw\") pod \"olm-operator-6b444d44fb-pg2dv\" (UID: \"130d718f-ff56-44e7-87ab-f0c2b1a99e9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.087005 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94c8\" (UniqueName: \"kubernetes.io/projected/b6086c5b-4528-4e20-b9a8-67b20b450516-kube-api-access-k94c8\") pod \"migrator-59844c95c7-pmw47\" (UID: \"b6086c5b-4528-4e20-b9a8-67b20b450516\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.088712 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.098878 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.099807 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"]
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.138268 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42n4r"]
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.161670 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.162073 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.662046806 +0000 UTC m=+143.192677217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.162588 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.164192 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.664184058 +0000 UTC m=+143.194814469 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.189904 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.265074 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.265788 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.266026 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.766008204 +0000 UTC m=+143.296638615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.266147 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.266456 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.766444835 +0000 UTC m=+143.297075246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.280492 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.295009 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.307243 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4sj79"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.307329 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.323059 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.324691 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.367349 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.367480 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.867460902 +0000 UTC m=+143.398091303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.371867 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.372555 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.872286669 +0000 UTC m=+143.402917080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.381928 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"] Feb 17 15:23:01 crc kubenswrapper[4806]: W0217 15:23:01.396727 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac7c661c_cf5d_418e_89d3_bc516cabd0e6.slice/crio-8266402e0280ff78961d7174045e3a0b86dab924757a6971bbfa0eca0dfaa755 WatchSource:0}: Error finding container 8266402e0280ff78961d7174045e3a0b86dab924757a6971bbfa0eca0dfaa755: Status 404 returned error can't find the container with id 8266402e0280ff78961d7174045e3a0b86dab924757a6971bbfa0eca0dfaa755 Feb 17 15:23:01 crc kubenswrapper[4806]: W0217 15:23:01.428117 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8464248c_a865_432e_86e0_d67bd9609645.slice/crio-44a5c5b4cd077650106dfe125c786921d603cbd2f0b515eebfe0483e8b52b83d WatchSource:0}: Error finding container 44a5c5b4cd077650106dfe125c786921d603cbd2f0b515eebfe0483e8b52b83d: Status 404 returned error can't find the container with id 44a5c5b4cd077650106dfe125c786921d603cbd2f0b515eebfe0483e8b52b83d Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.431482 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c4tb8"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.472746 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.473086 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.973025479 +0000 UTC m=+143.503655880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.473182 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.473563 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:01.973547702 +0000 UTC m=+143.504178113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.477375 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.478732 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.578529 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.579146 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.079087178 +0000 UTC m=+143.609717589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.579245 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.580070 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.080033261 +0000 UTC m=+143.610663672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.660421 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.672145 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.675545 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.682764 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.683523 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.183486767 +0000 UTC m=+143.714117188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.683778 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.684461 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lfqq8"] Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.692335 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.192285799 +0000 UTC m=+143.722916210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.761381 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-dhdkp" podStartSLOduration=121.761360956 podStartE2EDuration="2m1.761360956s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:01.75200693 +0000 UTC m=+143.282637341" watchObservedRunningTime="2026-02-17 15:23:01.761360956 +0000 UTC m=+143.291991367" Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.785156 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.786778 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.286759509 +0000 UTC m=+143.817389920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: W0217 15:23:01.820358 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod910f725b_6137_4ae8_b86f_9fc53af7d1ce.slice/crio-0df8629fa31d3bf1003c09d97a7d01a16afdf9fb957c59d22b0e086e405f3dc5 WatchSource:0}: Error finding container 0df8629fa31d3bf1003c09d97a7d01a16afdf9fb957c59d22b0e086e405f3dc5: Status 404 returned error can't find the container with id 0df8629fa31d3bf1003c09d97a7d01a16afdf9fb957c59d22b0e086e405f3dc5 Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.826388 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-clfv7"] Feb 17 15:23:01 crc kubenswrapper[4806]: W0217 15:23:01.863771 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c7556e_9967_49d8_aa86_a82a9a6eb29a.slice/crio-16e8d0dad7bbdd25db92eaa6221a1aa9ec6d6b1de23f8703eef0bb64d34f51ef WatchSource:0}: Error finding container 16e8d0dad7bbdd25db92eaa6221a1aa9ec6d6b1de23f8703eef0bb64d34f51ef: Status 404 returned error can't find the container with id 16e8d0dad7bbdd25db92eaa6221a1aa9ec6d6b1de23f8703eef0bb64d34f51ef Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.887978 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.888237 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.388226437 +0000 UTC m=+143.918856848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.927690 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r5477"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.949610 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.961886 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.980556 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-28rqh"] Feb 17 15:23:01 crc kubenswrapper[4806]: I0217 15:23:01.989230 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:01 crc kubenswrapper[4806]: E0217 15:23:01.990163 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.490148446 +0000 UTC m=+144.020778857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.008905 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.092100 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.092811 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 15:23:02.592798943 +0000 UTC m=+144.123429354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.113623 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h29nb" event={"ID":"8625b175-0be3-4e29-bc64-09452e0f87ca","Type":"ContainerStarted","Data":"77937d4960c5b3c5eede3b4738739dadc4f5c926039c53095a8c45ada6922f0a"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.114978 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:02 crc kubenswrapper[4806]: W0217 15:23:02.122385 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074f20d5_eaad_4185_88d1_fae34a78e015.slice/crio-ab8eb3f39a2fe8212d95415802a284b68b08c74a4c20d9840beec94cab7e2679 WatchSource:0}: Error finding container ab8eb3f39a2fe8212d95415802a284b68b08c74a4c20d9840beec94cab7e2679: Status 404 returned error can't find the container with id ab8eb3f39a2fe8212d95415802a284b68b08c74a4c20d9840beec94cab7e2679 Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.149389 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" 
event={"ID":"c335350f-90c3-4c01-9b58-423e540ea120","Type":"ContainerStarted","Data":"6b338b662b9cdb8c4f497dd3705b65501458763d69e4a5c7c47fb0312c52bd02"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.149630 4806 patch_prober.go:28] interesting pod/console-operator-58897d9998-h29nb container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.149712 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h29nb" podUID="8625b175-0be3-4e29-bc64-09452e0f87ca" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.185194 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" event={"ID":"910f725b-6137-4ae8-b86f-9fc53af7d1ce","Type":"ContainerStarted","Data":"0df8629fa31d3bf1003c09d97a7d01a16afdf9fb957c59d22b0e086e405f3dc5"} Feb 17 15:23:02 crc kubenswrapper[4806]: W0217 15:23:02.185495 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea7524e_6205_4b23_bec9_028f0ebe3cf2.slice/crio-a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7 WatchSource:0}: Error finding container a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7: Status 404 returned error can't find the container with id a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7 Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.199726 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.201612 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.701588527 +0000 UTC m=+144.232218938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.228266 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ds4hl" event={"ID":"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481","Type":"ContainerStarted","Data":"f5ef09157c5ad872cdb2c3cf4aef120f852a45c8b79882770f46e2b05d3a0f50"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.250308 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" event={"ID":"0e9e9afe-0548-44f0-a905-4ba3c9aa16af","Type":"ContainerStarted","Data":"f78737dc972bdc7438fae1dfb1c36100227a4f41c2fd719c8e79fb6f8108c3a9"} Feb 17 15:23:02 crc kubenswrapper[4806]: W0217 15:23:02.270552 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46737ae1_a5eb_453f_aa74_2af76d30d7c3.slice/crio-bfb44a42059ac5fd3874e1f995e664c2e5b72d3ef70c1aa11bb82244e85f42d2 WatchSource:0}: Error finding container bfb44a42059ac5fd3874e1f995e664c2e5b72d3ef70c1aa11bb82244e85f42d2: Status 404 returned error can't find the container with id bfb44a42059ac5fd3874e1f995e664c2e5b72d3ef70c1aa11bb82244e85f42d2 Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.288867 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" event={"ID":"c0c59008-76e7-4196-9ef3-001a9be8bc7d","Type":"ContainerStarted","Data":"39248f2a20cf919c89cbe316d500078294e79e8a365a589477cc6af1592b7c6f"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.304976 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.305712 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.805696449 +0000 UTC m=+144.336326860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.334845 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" event={"ID":"435b5b72-d2ad-4d48-94a8-ed1da78cb5c6","Type":"ContainerStarted","Data":"ddd303942223545fde3d234bd58ff681c43431e67d50843009b1723fdd89aa7c"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.379964 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.396029 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r4ldh" event={"ID":"6be3fc8f-849e-4d01-948a-46bc9ca06a05","Type":"ContainerStarted","Data":"1630573b085a6530309dd90325935d41d87d1193190c19a4cf336b4dd477a82b"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.406064 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" event={"ID":"87968279-7e35-4a0a-b1a2-bbdd91ea184d","Type":"ContainerStarted","Data":"a88443376cc3b0bd606c9d25842bc8bdaa1659eb4dc70bf94dc15cc668d8ef8f"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.407552 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" 
event={"ID":"cd80c0d6-5372-4ad5-a5e1-45ecad939749","Type":"ContainerStarted","Data":"4ff859845860027b7eacbbbd71c7fdef5b82a316139a984a5f58569d29035d47"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.410500 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.413855 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:02.913818227 +0000 UTC m=+144.444448638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.435361 4806 generic.go:334] "Generic (PLEG): container finished" podID="7b3651e3-2763-4f4b-a953-3ec65b52a8a7" containerID="ec71fc98da560a71a32520df73e095940aeb5ec129a663946c07ca2828f80fc5" exitCode=0 Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.435538 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" event={"ID":"7b3651e3-2763-4f4b-a953-3ec65b52a8a7","Type":"ContainerDied","Data":"ec71fc98da560a71a32520df73e095940aeb5ec129a663946c07ca2828f80fc5"} Feb 17 15:23:02 crc 
kubenswrapper[4806]: I0217 15:23:02.470384 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" event={"ID":"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0","Type":"ContainerStarted","Data":"752d45842936939440ae5131cc3bf3165a08c10235961c9d79af9007e6b3620a"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.501820 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sj79" event={"ID":"ac7c661c-cf5d-418e-89d3-bc516cabd0e6","Type":"ContainerStarted","Data":"8266402e0280ff78961d7174045e3a0b86dab924757a6971bbfa0eca0dfaa755"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.508398 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" event={"ID":"9131411b-b7d0-47b5-a4a5-ce289282d5c3","Type":"ContainerStarted","Data":"f2da644df7e853e5e62146db1cab6085cdd1f8bb6cfe11ad5086144972038662"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.514777 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.515706 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" event={"ID":"58f729fc-8ddb-43b2-897b-23fffea83c73","Type":"ContainerStarted","Data":"4d387fa20a40567aa5775bf145818d9fd7488ae1d6c3562bc5e9e69e9209f36c"} Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.522818 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.022800406 +0000 UTC m=+144.553430817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.531818 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5nlh8"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.535933 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" event={"ID":"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7","Type":"ContainerStarted","Data":"e7de5cd88b420e0d022866ed651d11216231108d1f5256eca0e23eead152a732"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.536283 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.547458 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-clfv7" event={"ID":"45c1d170-0968-44d2-b9cd-5dcd8732afc3","Type":"ContainerStarted","Data":"bcf922eea66c877b849d9d7b2bb36b2afd63ad4760365478bfb57b907bbe90dd"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.561877 4806 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rbxx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.561937 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.567682 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.580601 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" event={"ID":"f8c7556e-9967-49d8-aa86-a82a9a6eb29a","Type":"ContainerStarted","Data":"16e8d0dad7bbdd25db92eaa6221a1aa9ec6d6b1de23f8703eef0bb64d34f51ef"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.585276 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" event={"ID":"10565cb3-8e68-4dd9-9bac-fc770b23825b","Type":"ContainerStarted","Data":"86d7a829cb2b38e776164deaff9d02e2f5c22a87db54d35be6fd68af132bf088"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.592638 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" event={"ID":"8464248c-a865-432e-86e0-d67bd9609645","Type":"ContainerStarted","Data":"44a5c5b4cd077650106dfe125c786921d603cbd2f0b515eebfe0483e8b52b83d"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.604517 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 
17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.605279 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" event={"ID":"1e510f3a-7afd-4c62-92a4-e898a6b635fe","Type":"ContainerStarted","Data":"3858ea717c1caa04eb927fb589d72f96ce362350d7d47f83f41fec50363c883d"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.612616 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:02 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:02 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:02 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.612670 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.615574 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.617523 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.117392339 +0000 UTC m=+144.648022750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.619233 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-x5v75"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.622588 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" event={"ID":"954c1de6-6017-448a-addb-5fdc73d0987b","Type":"ContainerStarted","Data":"be0d98b3a714dec53c5fa58d5c495d6045207f439af2c032b6336c4af2e30744"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.622652 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" event={"ID":"954c1de6-6017-448a-addb-5fdc73d0987b","Type":"ContainerStarted","Data":"a1408f2c2fe7d4af03907b336c6bf84e9f81d66daf646ff5e1e0816a0cb73248"} Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.633897 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.635412 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gb5xj"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.720976 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.722990 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.222973336 +0000 UTC m=+144.753603737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.758855 4806 csr.go:261] certificate signing request csr-qtlbn is approved, waiting to be issued Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.778573 4806 csr.go:257] certificate signing request csr-qtlbn is issued Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.804087 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" podStartSLOduration=122.804062973 podStartE2EDuration="2m2.804062973s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:02.802918735 +0000 UTC m=+144.333549166" watchObservedRunningTime="2026-02-17 15:23:02.804062973 +0000 UTC m=+144.334693384" Feb 17 15:23:02 crc kubenswrapper[4806]: W0217 15:23:02.805144 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42513bc4_7e15_4e8f_b1aa_006b42a10ff4.slice/crio-1b266657ae8c0e1c845530efa8fdf6c1d99a8a0baca0fca476c5b8754face156 WatchSource:0}: Error finding container 1b266657ae8c0e1c845530efa8fdf6c1d99a8a0baca0fca476c5b8754face156: Status 404 returned error can't find the container with id 1b266657ae8c0e1c845530efa8fdf6c1d99a8a0baca0fca476c5b8754face156 Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.821394 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.822083 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.822513 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.322495397 +0000 UTC m=+144.853125808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.830602 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"] Feb 17 15:23:02 crc kubenswrapper[4806]: W0217 15:23:02.844024 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15adb8fa_833b_4335_9788_50d5bb34e14d.slice/crio-621f04c2a50bc5177e16940b7141976b2b3a9f0a8bbbe0645ab39af4c70f468c WatchSource:0}: Error finding container 621f04c2a50bc5177e16940b7141976b2b3a9f0a8bbbe0645ab39af4c70f468c: Status 404 returned error can't find the container with id 621f04c2a50bc5177e16940b7141976b2b3a9f0a8bbbe0645ab39af4c70f468c Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.848614 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7jn9k"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.855719 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.893046 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.921798 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"] Feb 17 15:23:02 crc kubenswrapper[4806]: I0217 15:23:02.924165 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:02 crc kubenswrapper[4806]: E0217 15:23:02.924689 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.424669533 +0000 UTC m=+144.955299944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.025288 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.025543 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.525509235 +0000 UTC m=+145.056139646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.025769 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.026441 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.526431028 +0000 UTC m=+145.057061439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.095036 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r4ldh" podStartSLOduration=123.095011052 podStartE2EDuration="2m3.095011052s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.094174372 +0000 UTC m=+144.624804793" watchObservedRunningTime="2026-02-17 15:23:03.095011052 +0000 UTC m=+144.625641463" Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.127540 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.128020 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.628001558 +0000 UTC m=+145.158631969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.157328 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" podStartSLOduration=123.157309165 podStartE2EDuration="2m3.157309165s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.13885018 +0000 UTC m=+144.669480611" watchObservedRunningTime="2026-02-17 15:23:03.157309165 +0000 UTC m=+144.687939576" Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.158132 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-czntk" podStartSLOduration=123.158125415 podStartE2EDuration="2m3.158125415s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.156575348 +0000 UTC m=+144.687205759" watchObservedRunningTime="2026-02-17 15:23:03.158125415 +0000 UTC m=+144.688755826" Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.233029 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.233842 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.733827232 +0000 UTC m=+145.264457643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.335186 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.335592 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.835578167 +0000 UTC m=+145.366208578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.380057 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h29nb" podStartSLOduration=123.380039199 podStartE2EDuration="2m3.380039199s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.374948527 +0000 UTC m=+144.905578948" watchObservedRunningTime="2026-02-17 15:23:03.380039199 +0000 UTC m=+144.910669600"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.381554 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-k8m74" podStartSLOduration=124.381547516 podStartE2EDuration="2m4.381547516s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.336347515 +0000 UTC m=+144.866977926" watchObservedRunningTime="2026-02-17 15:23:03.381547516 +0000 UTC m=+144.912177927"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.404557 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" podStartSLOduration=124.40453425 podStartE2EDuration="2m4.40453425s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.397852789 +0000 UTC m=+144.928483210" watchObservedRunningTime="2026-02-17 15:23:03.40453425 +0000 UTC m=+144.935164661"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.437028 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.437725 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:03.937698341 +0000 UTC m=+145.468328752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.540014 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.540322 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.040306266 +0000 UTC m=+145.570936677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.614061 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 15:23:03 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld
Feb 17 15:23:03 crc kubenswrapper[4806]: [+]process-running ok
Feb 17 15:23:03 crc kubenswrapper[4806]: healthz check failed
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.614109 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.658545 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.658938 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.158924548 +0000 UTC m=+145.689554959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.685167 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" event={"ID":"42513bc4-7e15-4e8f-b1aa-006b42a10ff4","Type":"ContainerStarted","Data":"1b266657ae8c0e1c845530efa8fdf6c1d99a8a0baca0fca476c5b8754face156"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.693518 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ds4hl" event={"ID":"5dce6f5c-d2cb-4d42-9635-8dc6d4f10481","Type":"ContainerStarted","Data":"2fbc50636bc2ec8c5196861845d902924a87eb22ddd24ad6cbfd7d7b8c7885fd"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.697367 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" event={"ID":"77e696b4-bfbb-4600-8f7a-91772f7e8322","Type":"ContainerStarted","Data":"44554005188ef7a719418e3f6f33a9b782047345f658780606330c6a1dde9b42"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.712769 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" event={"ID":"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0","Type":"ContainerStarted","Data":"a704f5b1e202d104df391d8372ad87408a78582cfee3a25e0a71e91d05ef05d3"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.715003 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" event={"ID":"1e510f3a-7afd-4c62-92a4-e898a6b635fe","Type":"ContainerStarted","Data":"7325d8e85ab3c3dd2fb3150f1c139ac7f9cf0b8387f2b237f0585eb7359243dd"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.731130 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ds4hl" podStartSLOduration=6.7311114 podStartE2EDuration="6.7311114s" podCreationTimestamp="2026-02-17 15:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.729627694 +0000 UTC m=+145.260258105" watchObservedRunningTime="2026-02-17 15:23:03.7311114 +0000 UTC m=+145.261741811"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.755214 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" event={"ID":"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7","Type":"ContainerStarted","Data":"7872f25bf5a389c2766d24e2c52cb0177d4107f9ebcd13052b41da528708a628"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.759120 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.759907 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.259894174 +0000 UTC m=+145.790524575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.781314 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" event={"ID":"9131411b-b7d0-47b5-a4a5-ce289282d5c3","Type":"ContainerStarted","Data":"26fbc5b53fafba7d8a80dcb6d1b40ee666b24b2d7da88daa3f0fbdd981c877c6"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.781494 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 15:18:02 +0000 UTC, rotation deadline is 2027-01-11 00:51:44.808981326 +0000 UTC
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.781551 4806 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7857h28m41.027432749s for next certificate rotation
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.789677 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" event={"ID":"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0","Type":"ContainerStarted","Data":"745a59fab0d4598009062e6e2d172128024dcff42b1dcd8b83e56daa22456ed6"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.817867 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lz4lh" podStartSLOduration=123.817843372 podStartE2EDuration="2m3.817843372s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.817058753 +0000 UTC m=+145.347689174" watchObservedRunningTime="2026-02-17 15:23:03.817843372 +0000 UTC m=+145.348473783"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.818744 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" event={"ID":"cd80c0d6-5372-4ad5-a5e1-45ecad939749","Type":"ContainerStarted","Data":"82b56923e1db496649496e056fbc092639aa6007cd990030a8bdd6b8493c81a0"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.835530 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" event={"ID":"0e9e9afe-0548-44f0-a905-4ba3c9aa16af","Type":"ContainerStarted","Data":"2c7f7cc00ca54eaf88661317ffd46fb8b9a7776a70b3d29d81779da734d7f926"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.862698 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" event={"ID":"defc246b-6f58-4f10-82c1-a07c2ef017ca","Type":"ContainerStarted","Data":"935c3e1249498f3cf4f16428bcb394924199cb9aa4a9313e0e0aa44d5f0ee75e"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.862747 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" event={"ID":"defc246b-6f58-4f10-82c1-a07c2ef017ca","Type":"ContainerStarted","Data":"a4ffe1d7e38634602350969eef206680b0bf9c67b9a03577bd2267f127cd1848"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.874306 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" event={"ID":"46737ae1-a5eb-453f-aa74-2af76d30d7c3","Type":"ContainerStarted","Data":"bfb44a42059ac5fd3874e1f995e664c2e5b72d3ef70c1aa11bb82244e85f42d2"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.879326 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.879986 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" event={"ID":"4a3ff543-139c-48f8-a201-103c00c8b23e","Type":"ContainerStarted","Data":"512be63cba5e583225bbe6a5d6da094a3242864ad9b3644994b69565e1f3cd86"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.885373 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-clfv7" event={"ID":"45c1d170-0968-44d2-b9cd-5dcd8732afc3","Type":"ContainerStarted","Data":"a2f0db3a5548ab1c93cdddf92c3212613b143933527fae702e9990e8abfe809a"}
Feb 17 15:23:03 crc kubenswrapper[4806]: E0217 15:23:03.885709 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.385693569 +0000 UTC m=+145.916323980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.886397 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-clfv7"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.887741 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7jn9k" event={"ID":"18fcb65b-e08a-4c4b-b8c3-d474117395b5","Type":"ContainerStarted","Data":"528bf4fc6c653bb96ad612eac6269985b1a16addff8d0f97e824624f8af8fb62"}
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.906701 4806 patch_prober.go:28] interesting pod/downloads-7954f5f757-clfv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.906746 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clfv7" podUID="45c1d170-0968-44d2-b9cd-5dcd8732afc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.907468 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-42n4r" podStartSLOduration=123.907456865 podStartE2EDuration="2m3.907456865s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.906968663 +0000 UTC m=+145.437599084" watchObservedRunningTime="2026-02-17 15:23:03.907456865 +0000 UTC m=+145.438087276"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.907888 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kkbn8" podStartSLOduration=123.907884475 podStartE2EDuration="2m3.907884475s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:03.84302111 +0000 UTC m=+145.373651541" watchObservedRunningTime="2026-02-17 15:23:03.907884475 +0000 UTC m=+145.438514886"
Feb 17 15:23:03 crc kubenswrapper[4806]: I0217 15:23:03.911470 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6mpm9" event={"ID":"58f729fc-8ddb-43b2-897b-23fffea83c73","Type":"ContainerStarted","Data":"ffda02f74ab2154feb0adfc3a5439fedff7dd2fc45ef4f488a83cb4edf8a6cda"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.010651 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" event={"ID":"10565cb3-8e68-4dd9-9bac-fc770b23825b","Type":"ContainerStarted","Data":"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.011839 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.012386 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.013875 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.513858012 +0000 UTC m=+146.044488423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.026636 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-clfv7" podStartSLOduration=124.02661435 podStartE2EDuration="2m4.02661435s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.014382204 +0000 UTC m=+145.545012625" watchObservedRunningTime="2026-02-17 15:23:04.02661435 +0000 UTC m=+145.557244761"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.030014 4806 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-dp9cm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body=
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.030078 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.030811 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" event={"ID":"76315151-b675-410e-9ed9-8e39ebd883b3","Type":"ContainerStarted","Data":"82735a3c28ba6c88d74bf0a67597efe270d3becf6839d2465f82f1f47220ffd8"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.038189 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" event={"ID":"4828e844-f021-4591-ab25-ca198d3e577b","Type":"ContainerStarted","Data":"13b6e45c318be58caa377c5724b4d57c57801a6b5ff268f69768e906f4d7f07f"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.076665 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" event={"ID":"074f20d5-eaad-4185-88d1-fae34a78e015","Type":"ContainerStarted","Data":"3fda491637b13691c71952cf8c9a4f67cfc75d0664f1734c2eb0ade99a040095"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.077198 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" event={"ID":"074f20d5-eaad-4185-88d1-fae34a78e015","Type":"ContainerStarted","Data":"ab8eb3f39a2fe8212d95415802a284b68b08c74a4c20d9840beec94cab7e2679"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.076778 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r5477" podStartSLOduration=124.076762889 podStartE2EDuration="2m4.076762889s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.074652749 +0000 UTC m=+145.605283170" watchObservedRunningTime="2026-02-17 15:23:04.076762889 +0000 UTC m=+145.607393290"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.077903 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.091205 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" event={"ID":"15adb8fa-833b-4335-9788-50d5bb34e14d","Type":"ContainerStarted","Data":"621f04c2a50bc5177e16940b7141976b2b3a9f0a8bbbe0645ab39af4c70f468c"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.103579 4806 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bxsmr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body=
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.103666 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" podUID="074f20d5-eaad-4185-88d1-fae34a78e015" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.105427 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" event={"ID":"8464248c-a865-432e-86e0-d67bd9609645","Type":"ContainerStarted","Data":"a99eb6183dfe7a617bfe68b2048d23ba252384b7c207310cb2a58a01f2bc55ae"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.121337 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.121695 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.621682783 +0000 UTC m=+146.152313194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.163302 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4sj79" event={"ID":"ac7c661c-cf5d-418e-89d3-bc516cabd0e6","Type":"ContainerStarted","Data":"2e26a0365e37f303a4e97401f5e9da29a6676fd812c89c601bd2e27175fa6c46"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.177697 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" podStartSLOduration=125.177678944 podStartE2EDuration="2m5.177678944s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.160842208 +0000 UTC m=+145.691472639" watchObservedRunningTime="2026-02-17 15:23:04.177678944 +0000 UTC m=+145.708309355"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.197928 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" event={"ID":"b6086c5b-4528-4e20-b9a8-67b20b450516","Type":"ContainerStarted","Data":"42700e4918a04490b3075a575e925c9194c47231281b1aad609cb105bf290792"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.221631 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.222577 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.722554817 +0000 UTC m=+146.253185228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.242496 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" event={"ID":"f8c7556e-9967-49d8-aa86-a82a9a6eb29a","Type":"ContainerStarted","Data":"9667e7c483b87fb7bffc87c3b77a133687afe0f51beae7da2278dbb30e2fd447"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.246205 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" event={"ID":"d4c76f4b-80c1-409a-acba-39a9edf0c975","Type":"ContainerStarted","Data":"98dd418d592b1751afbd4350e6aa635aa46b4b5dcdf1201100e105b7a5406ccf"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.278452 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" event={"ID":"673f6847-2447-49d3-9e10-5b7ae3363435","Type":"ContainerStarted","Data":"42b3cb6881a4efca40fd1b730d66edc75a5a83ee0f9918044f5a4b6c9d203940"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.291908 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" podStartSLOduration=124.29188874 podStartE2EDuration="2m4.29188874s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.225124019 +0000 UTC m=+145.755754430" watchObservedRunningTime="2026-02-17 15:23:04.29188874 +0000 UTC m=+145.822519141"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.292349 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" event={"ID":"c335350f-90c3-4c01-9b58-423e540ea120","Type":"ContainerStarted","Data":"de0e7500f0e3dc8278970ebe37ca6d037fb230a16c01a06c7e142facb22e5de7"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.324038 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.326542 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.826526876 +0000 UTC m=+146.357157287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.338658 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.338923 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" event={"ID":"910f725b-6137-4ae8-b86f-9fc53af7d1ce","Type":"ContainerStarted","Data":"587c5549d19ab2ecf42aa1fe15b7ba1b109446877ddd41d5d61e193b29da239e"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.359677 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" podStartSLOduration=124.359660295 podStartE2EDuration="2m4.359660295s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.300250102 +0000 UTC m=+145.830880523" watchObservedRunningTime="2026-02-17 15:23:04.359660295 +0000 UTC m=+145.890290706"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.360142 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tjb2d" podStartSLOduration=124.360138507 podStartE2EDuration="2m4.360138507s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.354953021 +0000 UTC m=+145.885583432" watchObservedRunningTime="2026-02-17 15:23:04.360138507 +0000 UTC m=+145.890768918"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.414224 4806 generic.go:334] "Generic (PLEG): container finished" podID="954c1de6-6017-448a-addb-5fdc73d0987b" containerID="be0d98b3a714dec53c5fa58d5c495d6045207f439af2c032b6336c4af2e30744" exitCode=0
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.414285 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" event={"ID":"954c1de6-6017-448a-addb-5fdc73d0987b","Type":"ContainerDied","Data":"be0d98b3a714dec53c5fa58d5c495d6045207f439af2c032b6336c4af2e30744"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.414738 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.425658 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.427160 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:04.927145653 +0000 UTC m=+146.457776064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.475001 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" event={"ID":"87968279-7e35-4a0a-b1a2-bbdd91ea184d","Type":"ContainerStarted","Data":"ccee94790c8a47c29aafd7d0121165e9f4a3faea9d68beaf615ec78544970974"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.488000 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" event={"ID":"bea7524e-6205-4b23-bec9-028f0ebe3cf2","Type":"ContainerStarted","Data":"27e3234a8628a98b6d139ce43340f30f13fa246c72f77f84c06df26f6fdb9d12"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.488046 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" event={"ID":"bea7524e-6205-4b23-bec9-028f0ebe3cf2","Type":"ContainerStarted","Data":"a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.504972 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lfqq8" podStartSLOduration=125.504954451 podStartE2EDuration="2m5.504954451s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.448833317 +0000 UTC m=+145.979463738" watchObservedRunningTime="2026-02-17 15:23:04.504954451 +0000 UTC m=+146.035584862"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.536908 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.539104 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.039083504 +0000 UTC m=+146.569713915 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.554650 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" event={"ID":"130d718f-ff56-44e7-87ab-f0c2b1a99e9b","Type":"ContainerStarted","Data":"57e2d0df510a347c25cb3d52cfc3171291eade6b96a6f049368a995e19a28e5d"}
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.557489 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv"
Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.591836 4806 patch_prober.go:28]
interesting pod/olm-operator-6b444d44fb-pg2dv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.591934 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nlh8" event={"ID":"25b773c8-e4fa-4b3c-ab59-74105f1296af","Type":"ContainerStarted","Data":"900040423f32e0864e1726e2942b5965b6cec5ce331f5db059fdf2ce7f300a95"} Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.591928 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" podUID="130d718f-ff56-44e7-87ab-f0c2b1a99e9b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.607750 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:04 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:04 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:04 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.607795 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.616120 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-58897d9998-h29nb" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.632353 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pht7c" podStartSLOduration=124.632333154 podStartE2EDuration="2m4.632333154s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.628225055 +0000 UTC m=+146.158855466" watchObservedRunningTime="2026-02-17 15:23:04.632333154 +0000 UTC m=+146.162963565" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.633586 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4sj79" podStartSLOduration=124.633580854 podStartE2EDuration="2m4.633580854s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.55137256 +0000 UTC m=+146.082002981" watchObservedRunningTime="2026-02-17 15:23:04.633580854 +0000 UTC m=+146.164211265" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.637322 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.638115 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 15:23:05.138101023 +0000 UTC m=+146.668731444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.706638 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" podStartSLOduration=124.706622356 podStartE2EDuration="2m4.706622356s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.670261809 +0000 UTC m=+146.200892220" watchObservedRunningTime="2026-02-17 15:23:04.706622356 +0000 UTC m=+146.237252767" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.733234 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" podStartSLOduration=125.733215168 podStartE2EDuration="2m5.733215168s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.706880682 +0000 UTC m=+146.237511093" watchObservedRunningTime="2026-02-17 15:23:04.733215168 +0000 UTC m=+146.263845579" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.745396 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.745857 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.245843863 +0000 UTC m=+146.776474274 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.770182 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" podStartSLOduration=124.770158649 podStartE2EDuration="2m4.770158649s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:04.762986646 +0000 UTC m=+146.293617067" watchObservedRunningTime="2026-02-17 15:23:04.770158649 +0000 UTC m=+146.300789050" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.792579 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.792635 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.847311 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.847684 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.347669999 +0000 UTC m=+146.878300410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:04 crc kubenswrapper[4806]: I0217 15:23:04.948725 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:04 crc kubenswrapper[4806]: E0217 15:23:04.949569 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.449558038 +0000 UTC m=+146.980188449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.049927 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.050206 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.550192026 +0000 UTC m=+147.080822437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.151520 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.151922 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.65190458 +0000 UTC m=+147.182534991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.252552 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.252893 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.752870286 +0000 UTC m=+147.283500697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.353970 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.354734 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.854722453 +0000 UTC m=+147.385352864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.455988 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.456304 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:05.956285494 +0000 UTC m=+147.486915905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.557862 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.558155 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.058144161 +0000 UTC m=+147.588774572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.599320 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7jn9k" event={"ID":"18fcb65b-e08a-4c4b-b8c3-d474117395b5","Type":"ContainerStarted","Data":"e188db53584a268f94404b0efa0197e9cdab4d2ca798b0e02e4ec3267341e71b"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.602272 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" event={"ID":"f8c7556e-9967-49d8-aa86-a82a9a6eb29a","Type":"ContainerStarted","Data":"e832e23338d9246f860ef64c2104c0db6110f891764c770f43962fcf8c2f6567"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.604052 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" event={"ID":"46737ae1-a5eb-453f-aa74-2af76d30d7c3","Type":"ContainerStarted","Data":"403ebec23a799dbd9fa1175801e9ffa519ecc5568941cb4dcba726534ae60419"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.605178 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:05 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:05 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:05 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:05 
crc kubenswrapper[4806]: I0217 15:23:05.605218 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.605373 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" event={"ID":"76315151-b675-410e-9ed9-8e39ebd883b3","Type":"ContainerStarted","Data":"e4a7faeea1604a5799afeaac65dedb0c152f5268b392ad4644aab797a226330f"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.605429 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" event={"ID":"76315151-b675-410e-9ed9-8e39ebd883b3","Type":"ContainerStarted","Data":"c171d91b6816bd6fd55db15c725b3c7b18f5d17d8f21f39b8ef057c56ff4f0c7"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.605572 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.607063 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" event={"ID":"673f6847-2447-49d3-9e10-5b7ae3363435","Type":"ContainerStarted","Data":"76464dc35cebb84e68c4755408a718afcdece494a741d91e6bdca1b6f91cfaf6"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.608450 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" event={"ID":"4efc9c9c-8be8-41de-b524-dfb7dc45c3d0","Type":"ContainerStarted","Data":"4bf270925da27bb5ac4be143c8e622df20f13e9ac0f0ab0713577460bda66717"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.610102 4806 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" event={"ID":"7b3651e3-2763-4f4b-a953-3ec65b52a8a7","Type":"ContainerStarted","Data":"73a18cd40508d16fd827c96b7bb7f47b43f0de9b01bd4b21c4507f6d25c26f7e"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.613542 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" event={"ID":"954c1de6-6017-448a-addb-5fdc73d0987b","Type":"ContainerStarted","Data":"be455d34b07062d2e8cf3fa806e357a6c791f4fa64fec86b2d9ca8d0a1161a5a"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.616190 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" event={"ID":"130d718f-ff56-44e7-87ab-f0c2b1a99e9b","Type":"ContainerStarted","Data":"d6c0985e698b2e59e9af133eceb07b3eb4ac8e64a33105e056d3070f31c509cb"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.617221 4806 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pg2dv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.617277 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" podUID="130d718f-ff56-44e7-87ab-f0c2b1a99e9b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.619004 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" 
event={"ID":"4a3ff543-139c-48f8-a201-103c00c8b23e","Type":"ContainerStarted","Data":"bd019b84bb9a65aebdfae6b90b3ba2d319a25f67cd084ca04cd66b15b701f7ee"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.619333 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.620697 4806 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-52z2c container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.620737 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" podUID="4a3ff543-139c-48f8-a201-103c00c8b23e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.622007 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" event={"ID":"2ea4db6d-8ae8-441a-9a97-fb4e2ca4ddc0","Type":"ContainerStarted","Data":"856cf4b046604a9e607c31791b8bd5b881b14a15d8dfe8d3429f13a8303b249c"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.623637 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-x5v75" event={"ID":"15adb8fa-833b-4335-9788-50d5bb34e14d","Type":"ContainerStarted","Data":"d8031304940fad18cfe9603b92b56ef978a555bca7ec960e525eb853d10a2a74"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.625955 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" 
event={"ID":"d4c76f4b-80c1-409a-acba-39a9edf0c975","Type":"ContainerStarted","Data":"b162d3bd96a4774cbb7c48533c1b2121c9a67d3d850962b55ffd80f8352d3062"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.626181 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.628417 4806 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6nnnv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.628473 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.628591 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" event={"ID":"b6086c5b-4528-4e20-b9a8-67b20b450516","Type":"ContainerStarted","Data":"f86f37b14e5e4a91a5d56cdca8639f316c1b63a3057857ad8f3e6943e7a5d6f2"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.628615 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" event={"ID":"b6086c5b-4528-4e20-b9a8-67b20b450516","Type":"ContainerStarted","Data":"5539485e78854d834ed5b0b7229ca1eeb80d0292771058051330761538270b65"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.631366 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" 
event={"ID":"77e696b4-bfbb-4600-8f7a-91772f7e8322","Type":"ContainerStarted","Data":"82bd6fe9bf0ce7f9e13d36736c3491804cbc036a00d5edd1957e604bc95527a4"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.631393 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" event={"ID":"77e696b4-bfbb-4600-8f7a-91772f7e8322","Type":"ContainerStarted","Data":"12203d264bbf0970b87a7e8b7b7132bbefb8bfb7ed55371ef749aa9496d93f53"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.634207 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" event={"ID":"1e510f3a-7afd-4c62-92a4-e898a6b635fe","Type":"ContainerStarted","Data":"935b86ffd0b0212b9682f402276e6b2292ad14d58551bd749a766b92d6033d03"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.637582 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nlh8" event={"ID":"25b773c8-e4fa-4b3c-ab59-74105f1296af","Type":"ContainerStarted","Data":"abf3987b0b0b044b4a155802d0a56a42de022683de1ddeda8c5a2f1cd4202bed"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.637622 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nlh8" event={"ID":"25b773c8-e4fa-4b3c-ab59-74105f1296af","Type":"ContainerStarted","Data":"4d1916de7451896f0f409c5231f4641648c24c3c99b2c2369ae73e79cc5eb841"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.638039 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.639473 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" event={"ID":"42513bc4-7e15-4e8f-b1aa-006b42a10ff4","Type":"ContainerStarted","Data":"d772d74996807ff95eda595c12707ca629963cefb674d704b12b6a3de2c5697e"} Feb 17 15:23:05 crc 
kubenswrapper[4806]: I0217 15:23:05.650234 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" event={"ID":"4828e844-f021-4591-ab25-ca198d3e577b","Type":"ContainerStarted","Data":"96c4fd04bdf8427dedd1a89233af34fd2e638bb565f18a40ad0e6eb6bcf1da6d"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.650278 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" event={"ID":"4828e844-f021-4591-ab25-ca198d3e577b","Type":"ContainerStarted","Data":"ef3ee6c7089da6e60fa2a925f524b004c7f5dc04c1de6c5432c5e5b486cc9af1"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.651642 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7jn9k" podStartSLOduration=7.651625407 podStartE2EDuration="7.651625407s" podCreationTimestamp="2026-02-17 15:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:05.644340361 +0000 UTC m=+147.174970772" watchObservedRunningTime="2026-02-17 15:23:05.651625407 +0000 UTC m=+147.182255808" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.655422 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" event={"ID":"87968279-7e35-4a0a-b1a2-bbdd91ea184d","Type":"ContainerStarted","Data":"6b26a326c49267b9ea8c23e47d883c9c5f430818d56e16695afaea09658b6dc6"} Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.656909 4806 patch_prober.go:28] interesting pod/downloads-7954f5f757-clfv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.656951 4806 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clfv7" podUID="45c1d170-0968-44d2-b9cd-5dcd8732afc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.658421 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.661603 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.161579417 +0000 UTC m=+147.692209828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.687636 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.770654 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.781493 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.281476 +0000 UTC m=+147.812106411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.839157 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5nlh8" podStartSLOduration=8.839136290999999 podStartE2EDuration="8.839136291s" podCreationTimestamp="2026-02-17 15:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:05.709592465 +0000 UTC m=+147.240222896" watchObservedRunningTime="2026-02-17 15:23:05.839136291 +0000 UTC m=+147.369766702" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.877905 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.878274 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.378258025 +0000 UTC m=+147.908888436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.942379 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" podStartSLOduration=125.94235805 podStartE2EDuration="2m5.94235805s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:05.788382806 +0000 UTC m=+147.319013227" watchObservedRunningTime="2026-02-17 15:23:05.94235805 +0000 UTC m=+147.472988461" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.942622 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" podStartSLOduration=126.942619037 podStartE2EDuration="2m6.942619037s" podCreationTimestamp="2026-02-17 15:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:05.941190542 +0000 UTC m=+147.471820963" watchObservedRunningTime="2026-02-17 15:23:05.942619037 +0000 UTC m=+147.473249448" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.974942 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" podStartSLOduration=125.974923866 podStartE2EDuration="2m5.974923866s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:05.970369346 +0000 UTC m=+147.500999767" watchObservedRunningTime="2026-02-17 15:23:05.974923866 +0000 UTC m=+147.505554287" Feb 17 15:23:05 crc kubenswrapper[4806]: I0217 15:23:05.979272 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:05 crc kubenswrapper[4806]: E0217 15:23:05.979706 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.479694411 +0000 UTC m=+148.010324822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.025128 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-r7shr" podStartSLOduration=126.025106337 podStartE2EDuration="2m6.025106337s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.024724988 +0000 UTC m=+147.555355409" watchObservedRunningTime="2026-02-17 15:23:06.025106337 +0000 UTC m=+147.555736748" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.048268 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s45gp" podStartSLOduration=126.048254555 podStartE2EDuration="2m6.048254555s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.043582123 +0000 UTC m=+147.574212534" watchObservedRunningTime="2026-02-17 15:23:06.048254555 +0000 UTC m=+147.578884966" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.081039 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.081387 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.581369144 +0000 UTC m=+148.111999555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.106134 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" podStartSLOduration=126.106119341 podStartE2EDuration="2m6.106119341s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.103869397 +0000 UTC m=+147.634499818" watchObservedRunningTime="2026-02-17 15:23:06.106119341 +0000 UTC m=+147.636749752" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.182998 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 
crc kubenswrapper[4806]: E0217 15:23:06.183416 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.683384596 +0000 UTC m=+148.214015007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.186733 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hg9q5" podStartSLOduration=126.186714356 podStartE2EDuration="2m6.186714356s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.145968893 +0000 UTC m=+147.676599324" watchObservedRunningTime="2026-02-17 15:23:06.186714356 +0000 UTC m=+147.717344767" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.187239 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pmw47" podStartSLOduration=126.187234128 podStartE2EDuration="2m6.187234128s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.184534143 +0000 UTC m=+147.715164574" watchObservedRunningTime="2026-02-17 15:23:06.187234128 +0000 UTC 
m=+147.717864539" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.224025 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-d2rfl" podStartSLOduration=126.224009806 podStartE2EDuration="2m6.224009806s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.222356506 +0000 UTC m=+147.752986927" watchObservedRunningTime="2026-02-17 15:23:06.224009806 +0000 UTC m=+147.754640217" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.245390 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w6lrr" podStartSLOduration=126.245363971 podStartE2EDuration="2m6.245363971s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.24406125 +0000 UTC m=+147.774691661" watchObservedRunningTime="2026-02-17 15:23:06.245363971 +0000 UTC m=+147.775994382" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.270203 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq" podStartSLOduration=126.27018759 podStartE2EDuration="2m6.27018759s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.269477523 +0000 UTC m=+147.800107944" watchObservedRunningTime="2026-02-17 15:23:06.27018759 +0000 UTC m=+147.800818001" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.284560 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.284722 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.78470179 +0000 UTC m=+148.315332201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.284804 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.285069 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.785061219 +0000 UTC m=+148.315691630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.316545 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zmp4t" podStartSLOduration=126.316524638 podStartE2EDuration="2m6.316524638s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.31412276 +0000 UTC m=+147.844753191" watchObservedRunningTime="2026-02-17 15:23:06.316524638 +0000 UTC m=+147.847155049" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.362271 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-28rqh" podStartSLOduration=126.362256831 podStartE2EDuration="2m6.362256831s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.361423761 +0000 UTC m=+147.892054192" watchObservedRunningTime="2026-02-17 15:23:06.362256831 +0000 UTC m=+147.892887242" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.385392 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.385586 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.885554553 +0000 UTC m=+148.416184974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.385655 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.386063 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.886051805 +0000 UTC m=+148.416682276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.403853 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c4tb8" podStartSLOduration=126.403834044 podStartE2EDuration="2m6.403834044s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:06.402201915 +0000 UTC m=+147.932832336" watchObservedRunningTime="2026-02-17 15:23:06.403834044 +0000 UTC m=+147.934464455" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.487160 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.487344 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.987317789 +0000 UTC m=+148.517948200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.487741 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.488006 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:06.987994555 +0000 UTC m=+148.518624966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.589793 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.590211 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.090197691 +0000 UTC m=+148.620828102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.607351 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:06 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:06 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:06 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.607429 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.616429 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.656136 4806 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bxsmr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.656193 4806 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" podUID="074f20d5-eaad-4185-88d1-fae34a78e015" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.665357 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" event={"ID":"42513bc4-7e15-4e8f-b1aa-006b42a10ff4","Type":"ContainerStarted","Data":"3091cabc341541f3aa6c5b97dbf609d9f3e88b6329e208ee54a4e90208194c37"} Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.666661 4806 patch_prober.go:28] interesting pod/downloads-7954f5f757-clfv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.666699 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clfv7" podUID="45c1d170-0968-44d2-b9cd-5dcd8732afc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.667153 4806 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6nnnv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.667179 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator" probeResult="failure" 
output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.672872 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-52z2c" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.682341 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pg2dv" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.692805 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.694052 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.194038616 +0000 UTC m=+148.724669027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.730620 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dsdqr" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.794311 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.794849 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.294812178 +0000 UTC m=+148.825442589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.795236 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.798636 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.298610749 +0000 UTC m=+148.829241160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.883258 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bxsmr" Feb 17 15:23:06 crc kubenswrapper[4806]: I0217 15:23:06.908482 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:06 crc kubenswrapper[4806]: E0217 15:23:06.909224 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.409208028 +0000 UTC m=+148.939838439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.010085 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.010422 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.510393619 +0000 UTC m=+149.041024030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.112096 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.112324 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.612289248 +0000 UTC m=+149.142919659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.112588 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.112947 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.612939563 +0000 UTC m=+149.143569964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.213534 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.213723 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.213765 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.213827 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.213874 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.214389 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.71435754 +0000 UTC m=+149.244987951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.215187 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.220916 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.220935 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.231053 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.315388 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.315793 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.815779707 +0000 UTC m=+149.346410118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.390513 4806 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.416690 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.416910 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.916862076 +0000 UTC m=+149.447492497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.417180 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.417609 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:07.917598134 +0000 UTC m=+149.448228545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.479901 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.488667 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.498030 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.518828 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.519094 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.019061572 +0000 UTC m=+149.549691983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.612666 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:07 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:07 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:07 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.613038 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.622315 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.622638 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 15:23:08.122627331 +0000 UTC m=+149.653257732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.685627 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" event={"ID":"42513bc4-7e15-4e8f-b1aa-006b42a10ff4","Type":"ContainerStarted","Data":"3f397964b6ae12e910541497c36f0185490f04f5063e02a637e984850fda2bdc"} Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.685666 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" event={"ID":"42513bc4-7e15-4e8f-b1aa-006b42a10ff4","Type":"ContainerStarted","Data":"a2778b586c2a81ff70d2e876bcb8a0a5dd10c827954809cf6bfbae9432992892"} Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.722854 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.723163 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 15:23:08.223134916 +0000 UTC m=+149.753765327 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.723368 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.725165 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.225155264 +0000 UTC m=+149.755785675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.774130 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gb5xj" podStartSLOduration=10.774105085 podStartE2EDuration="10.774105085s" podCreationTimestamp="2026-02-17 15:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:07.718053223 +0000 UTC m=+149.248683644" watchObservedRunningTime="2026-02-17 15:23:07.774105085 +0000 UTC m=+149.304735506" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.825344 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.825644 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.325629249 +0000 UTC m=+149.856259660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.827168 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bltqq"] Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.828199 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.830289 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.836622 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bltqq"] Feb 17 15:23:07 crc kubenswrapper[4806]: W0217 15:23:07.923226 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-55eba57f8bcae376a7edf92e108e9712bec9f89cd70b37d48b9670d0dfea09e0 WatchSource:0}: Error finding container 55eba57f8bcae376a7edf92e108e9712bec9f89cd70b37d48b9670d0dfea09e0: Status 404 returned error can't find the container with id 55eba57f8bcae376a7edf92e108e9712bec9f89cd70b37d48b9670d0dfea09e0 Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.926766 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content\") pod 
\"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.926806 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.926839 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5df52\" (UniqueName: \"kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.926870 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:07 crc kubenswrapper[4806]: E0217 15:23:07.927187 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.427176268 +0000 UTC m=+149.957806679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.997357 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 15:23:07 crc kubenswrapper[4806]: I0217 15:23:07.998393 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.000202 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.000627 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.008367 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.028023 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.028251 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5df52\" (UniqueName: 
\"kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.028367 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.028395 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.028932 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: E0217 15:23:08.029021 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.529001545 +0000 UTC m=+150.059631956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.029584 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.036365 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xh468"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.037729 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.043953 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.046749 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xh468"] Feb 17 15:23:08 crc kubenswrapper[4806]: W0217 15:23:08.051238 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d01fa6edfc093c232cd7b50042af08b1f48cd3061d1c49717c01998697d7f0bf WatchSource:0}: Error finding container d01fa6edfc093c232cd7b50042af08b1f48cd3061d1c49717c01998697d7f0bf: Status 404 returned error can't find the container with id d01fa6edfc093c232cd7b50042af08b1f48cd3061d1c49717c01998697d7f0bf Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.055051 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5df52\" (UniqueName: \"kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52\") pod \"certified-operators-bltqq\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") " pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.128981 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h777r\" (UniqueName: \"kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.129039 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.129074 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.129139 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.129162 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.129188 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: E0217 15:23:08.129445 4806 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.629432938 +0000 UTC m=+150.160063349 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.143356 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.224741 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.228453 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233370 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233725 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h777r\" (UniqueName: \"kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233802 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233847 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233885 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content\") pod \"community-operators-xh468\" (UID: 
\"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.233939 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: E0217 15:23:08.234603 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.734582155 +0000 UTC m=+150.265212576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.235132 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.235693 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities\") pod 
\"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.235725 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.255015 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.265141 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.270797 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h777r\" (UniqueName: \"kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r\") pod \"community-operators-xh468\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") " pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.320194 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.335228 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.335306 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.335337 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.335358 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtdp\" (UniqueName: \"kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: E0217 15:23:08.335792 4806 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 15:23:08.835776127 +0000 UTC m=+150.366406538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8tfm" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.340891 4806 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T15:23:07.39089756Z","Handler":null,"Name":""} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.344224 4806 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.344276 4806 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.355761 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.436216 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.436425 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtdp\" (UniqueName: \"kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.436512 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.436567 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.437003 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " 
pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.437588 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.438496 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2xq5t"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.443951 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.456374 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xq5t"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.467825 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.471047 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtdp\" (UniqueName: \"kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp\") pod \"certified-operators-qtvt5\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") " pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.538664 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.538739 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.538774 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nck8k\" (UniqueName: \"kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.538840 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.561696 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtvt5" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.585165 4806 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.585203 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.596049 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bltqq"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.621243 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:08 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:08 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:08 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.621286 4806 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.640152 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.640232 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.640525 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nck8k\" (UniqueName: \"kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.641650 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.641948 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.643735 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8tfm\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.658585 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nck8k\" (UniqueName: \"kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k\") pod \"community-operators-2xq5t\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") " pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.701256 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4fc7411e363ae1460da54b739efb26eba2e8fc9cb0852d11acb15a85d3517256"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.701317 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"067a57b3ef174fdea6a653382377e17d50503d93d3b0b2d08998281b6a11e5de"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.712524 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"495ce5283cf62d07accca2d52438abd4fa43575621eecf7a132a710b74b9d97a"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.712587 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d01fa6edfc093c232cd7b50042af08b1f48cd3061d1c49717c01998697d7f0bf"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.713294 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.715318 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerStarted","Data":"7ca10d80a1d69086e006372cf95093e8e75fc2e85b41913075ba1b08ff41b737"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.717978 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"412caaabf647346ad240ce64146a10e16ab2db9ca3ad452a723720e8af070fa8"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.718012 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"55eba57f8bcae376a7edf92e108e9712bec9f89cd70b37d48b9670d0dfea09e0"} Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.746486 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.777791 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.826971 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xh468"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.908204 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"] Feb 17 15:23:08 crc kubenswrapper[4806]: I0217 15:23:08.943650 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:08 crc kubenswrapper[4806]: W0217 15:23:08.974484 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c38b9f_9f1b_42b9_924d_a7f9ff193e73.slice/crio-ebac9a638bc57bbc28551ac85ef3c5f0a00f39adf94d2417a651795b33360951 WatchSource:0}: Error finding container ebac9a638bc57bbc28551ac85ef3c5f0a00f39adf94d2417a651795b33360951: Status 404 returned error can't find the container with id ebac9a638bc57bbc28551ac85ef3c5f0a00f39adf94d2417a651795b33360951 Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.059579 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2xq5t"] Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.194373 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.471819 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"] Feb 17 15:23:09 crc kubenswrapper[4806]: W0217 15:23:09.479136 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32820688_4037_4b80_8a92_9ebe7068d02e.slice/crio-7bb16b14ca169479b51c8b48af328bec0cd91b7118aff03af3ea2a55b8a181cf WatchSource:0}: Error finding container 7bb16b14ca169479b51c8b48af328bec0cd91b7118aff03af3ea2a55b8a181cf: Status 404 returned error can't find the container with id 7bb16b14ca169479b51c8b48af328bec0cd91b7118aff03af3ea2a55b8a181cf Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.607628 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:09 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:09 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:09 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.607680 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.724054 4806 generic.go:334] "Generic (PLEG): container finished" podID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerID="b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24" exitCode=0 Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.724147 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerDied","Data":"b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.724557 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" 
event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerStarted","Data":"2f5244d398ba1a28d7939ca7c74528048e9efb2b510b2f51f4f20fc6f0de29cc"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.726083 4806 generic.go:334] "Generic (PLEG): container finished" podID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerID="b700a29bc8239f2966dab9aa428a0c70dc52d621e379191fa595a1b5735e683d" exitCode=0 Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.726154 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerDied","Data":"b700a29bc8239f2966dab9aa428a0c70dc52d621e379191fa595a1b5735e683d"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.726174 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerStarted","Data":"ebac9a638bc57bbc28551ac85ef3c5f0a00f39adf94d2417a651795b33360951"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.727747 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.728499 4806 generic.go:334] "Generic (PLEG): container finished" podID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerID="0933e94b7c2edca3d4e6af5cb2d433012ef3a2f67ff840a6c7376c71cde3ea88" exitCode=0 Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.728624 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerDied","Data":"0933e94b7c2edca3d4e6af5cb2d433012ef3a2f67ff840a6c7376c71cde3ea88"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.731885 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" 
event={"ID":"32820688-4037-4b80-8a92-9ebe7068d02e","Type":"ContainerStarted","Data":"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.731931 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" event={"ID":"32820688-4037-4b80-8a92-9ebe7068d02e","Type":"ContainerStarted","Data":"7bb16b14ca169479b51c8b48af328bec0cd91b7118aff03af3ea2a55b8a181cf"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.732016 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.739342 4806 generic.go:334] "Generic (PLEG): container finished" podID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerID="6d622350b309eed3474449f56cca0f3d5eb7abf832691b7e279aee964f80f36a" exitCode=0 Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.739446 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerDied","Data":"6d622350b309eed3474449f56cca0f3d5eb7abf832691b7e279aee964f80f36a"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.739500 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerStarted","Data":"2bef659c7e3d0bc29863eba136718b6a3097d5dd68f5ad8ef9939c84d0b8a94b"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.743554 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90aa964a-223b-494e-b429-b63a9dfd7c3f","Type":"ContainerStarted","Data":"2410cbf4f016ac56a41c65eda980add3fa55ff3f068b7ca1b87fb97384421cd2"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.743585 4806 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90aa964a-223b-494e-b429-b63a9dfd7c3f","Type":"ContainerStarted","Data":"0ca7baf906adf5e001f636e6f3a56679cc709c852ca6c247302da4c0c932ba48"} Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.806376 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" podStartSLOduration=129.806357007 podStartE2EDuration="2m9.806357007s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:09.794638004 +0000 UTC m=+151.325268425" watchObservedRunningTime="2026-02-17 15:23:09.806357007 +0000 UTC m=+151.336987418" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.830076 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2t6"] Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.831063 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.835501 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.863118 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5vf\" (UniqueName: \"kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.863203 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.863233 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.893451 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2t6"] Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.934810 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.9347945060000002 podStartE2EDuration="2.934794506s" podCreationTimestamp="2026-02-17 15:23:07 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:09.933229248 +0000 UTC m=+151.463859659" watchObservedRunningTime="2026-02-17 15:23:09.934794506 +0000 UTC m=+151.465424917" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.964904 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5vf\" (UniqueName: \"kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.964999 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.965027 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.965620 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.966304 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:09 crc kubenswrapper[4806]: I0217 15:23:09.992562 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5vf\" (UniqueName: \"kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf\") pod \"redhat-marketplace-wn2t6\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") " pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.144469 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.144530 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.153922 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.203284 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2t6" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.241065 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"] Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.242562 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.252268 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"] Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.268324 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gldp\" (UniqueName: \"kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.268714 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.268871 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.370065 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gldp\" (UniqueName: \"kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.370471 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.370543 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.371317 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.371324 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.396486 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gldp\" (UniqueName: \"kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp\") pod \"redhat-marketplace-qgs9g\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.472605 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wn2t6"] Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.474253 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.474369 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.487944 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.594036 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.603042 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.607123 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:10 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:10 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:10 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.607220 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.627204 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.627309 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.628863 4806 patch_prober.go:28] interesting pod/console-f9d7485db-4sj79 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.629357 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4sj79" podUID="ac7c661c-cf5d-418e-89d3-bc516cabd0e6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.752067 4806 generic.go:334] "Generic (PLEG): container finished" podID="90aa964a-223b-494e-b429-b63a9dfd7c3f" containerID="2410cbf4f016ac56a41c65eda980add3fa55ff3f068b7ca1b87fb97384421cd2" exitCode=0 Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.752148 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90aa964a-223b-494e-b429-b63a9dfd7c3f","Type":"ContainerDied","Data":"2410cbf4f016ac56a41c65eda980add3fa55ff3f068b7ca1b87fb97384421cd2"} Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.756693 4806 generic.go:334] "Generic (PLEG): container finished" podID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerID="209e0c9a2d582d241b879d82849fd6165a9352adc56f653a8ff85c53476501a4" exitCode=0 Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.757026 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" 
event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerDied","Data":"209e0c9a2d582d241b879d82849fd6165a9352adc56f653a8ff85c53476501a4"} Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.757080 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerStarted","Data":"f77def0fd30b6b5236034357cfef25166a37df61b56eab2f6ae82a7a232b20ec"} Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.765892 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5qkvd" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.769307 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2hntk" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.866694 4806 patch_prober.go:28] interesting pod/downloads-7954f5f757-clfv7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.866755 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-clfv7" podUID="45c1d170-0968-44d2-b9cd-5dcd8732afc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.867268 4806 patch_prober.go:28] interesting pod/downloads-7954f5f757-clfv7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.867327 4806 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-console/downloads-7954f5f757-clfv7" podUID="45c1d170-0968-44d2-b9cd-5dcd8732afc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 17 15:23:10 crc kubenswrapper[4806]: I0217 15:23:10.925748 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.224671 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7498n"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.226172 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.231007 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.239841 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7498n"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.293563 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.293727 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqvn\" (UniqueName: \"kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 
15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.293794 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.327123 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.396702 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqvn\" (UniqueName: \"kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.396810 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.396867 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.397414 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.398897 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.462508 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqvn\" (UniqueName: \"kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn\") pod \"redhat-operators-7498n\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") " pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.559376 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.606253 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:11 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:11 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:11 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.606314 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.623507 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.624807 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.632095 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.704262 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.704712 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.704775 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.805939 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.806004 4806 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.806060 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.806626 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.806826 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.829056 4806 generic.go:334] "Generic (PLEG): container finished" podID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerID="6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4" exitCode=0 Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.829168 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerDied","Data":"6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4"} Feb 17 15:23:11 crc 
kubenswrapper[4806]: I0217 15:23:11.829198 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerStarted","Data":"170b588d58cd5a2b09fb847252a6549f88d2b42bad4f35eabafcf75fd8ae4075"} Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.836214 4806 generic.go:334] "Generic (PLEG): container finished" podID="bea7524e-6205-4b23-bec9-028f0ebe3cf2" containerID="27e3234a8628a98b6d139ce43340f30f13fa246c72f77f84c06df26f6fdb9d12" exitCode=0 Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.836709 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" event={"ID":"bea7524e-6205-4b23-bec9-028f0ebe3cf2","Type":"ContainerDied","Data":"27e3234a8628a98b6d139ce43340f30f13fa246c72f77f84c06df26f6fdb9d12"} Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.849595 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md\") pod \"redhat-operators-sbjk5\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.932437 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7498n"] Feb 17 15:23:11 crc kubenswrapper[4806]: I0217 15:23:11.986853 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.225603 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.406744 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.414450 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access\") pod \"90aa964a-223b-494e-b429-b63a9dfd7c3f\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.414596 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir\") pod \"90aa964a-223b-494e-b429-b63a9dfd7c3f\" (UID: \"90aa964a-223b-494e-b429-b63a9dfd7c3f\") " Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.414677 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90aa964a-223b-494e-b429-b63a9dfd7c3f" (UID: "90aa964a-223b-494e-b429-b63a9dfd7c3f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.414874 4806 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90aa964a-223b-494e-b429-b63a9dfd7c3f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:12 crc kubenswrapper[4806]: W0217 15:23:12.422100 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf7442d4_1501_4230_98c0_d54f138f7c85.slice/crio-a247183775654341e03a1ea5d35ebb5c9bdec57b5f9981a8393b3fd3ff639f03 WatchSource:0}: Error finding container a247183775654341e03a1ea5d35ebb5c9bdec57b5f9981a8393b3fd3ff639f03: Status 404 returned error can't find the container with id a247183775654341e03a1ea5d35ebb5c9bdec57b5f9981a8393b3fd3ff639f03 Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.423911 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90aa964a-223b-494e-b429-b63a9dfd7c3f" (UID: "90aa964a-223b-494e-b429-b63a9dfd7c3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.517073 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90aa964a-223b-494e-b429-b63a9dfd7c3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.611265 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:12 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:12 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:12 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.611327 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.847783 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90aa964a-223b-494e-b429-b63a9dfd7c3f","Type":"ContainerDied","Data":"0ca7baf906adf5e001f636e6f3a56679cc709c852ca6c247302da4c0c932ba48"} Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.847823 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ca7baf906adf5e001f636e6f3a56679cc709c852ca6c247302da4c0c932ba48" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.847803 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.860910 4806 generic.go:334] "Generic (PLEG): container finished" podID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerID="2354912bf2ed3ec46ea8c9d827576fef0cab676c5cd7801d1edccc5f8c29cfda" exitCode=0 Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.861241 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerDied","Data":"2354912bf2ed3ec46ea8c9d827576fef0cab676c5cd7801d1edccc5f8c29cfda"} Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.861307 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerStarted","Data":"5c8ff183b6c8012de20852db3b3250d3636c50524846ac7e731f654cc57375a4"} Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.877312 4806 generic.go:334] "Generic (PLEG): container finished" podID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerID="d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1" exitCode=0 Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.877439 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerDied","Data":"d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1"} Feb 17 15:23:12 crc kubenswrapper[4806]: I0217 15:23:12.877523 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerStarted","Data":"a247183775654341e03a1ea5d35ebb5c9bdec57b5f9981a8393b3fd3ff639f03"} Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.154948 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.330961 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd2lb\" (UniqueName: \"kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb\") pod \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.331120 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume\") pod \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.331201 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume\") pod \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\" (UID: \"bea7524e-6205-4b23-bec9-028f0ebe3cf2\") " Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.332036 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume" (OuterVolumeSpecName: "config-volume") pod "bea7524e-6205-4b23-bec9-028f0ebe3cf2" (UID: "bea7524e-6205-4b23-bec9-028f0ebe3cf2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.335908 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb" (OuterVolumeSpecName: "kube-api-access-zd2lb") pod "bea7524e-6205-4b23-bec9-028f0ebe3cf2" (UID: "bea7524e-6205-4b23-bec9-028f0ebe3cf2"). 
InnerVolumeSpecName "kube-api-access-zd2lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.335991 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bea7524e-6205-4b23-bec9-028f0ebe3cf2" (UID: "bea7524e-6205-4b23-bec9-028f0ebe3cf2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.432477 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd2lb\" (UniqueName: \"kubernetes.io/projected/bea7524e-6205-4b23-bec9-028f0ebe3cf2-kube-api-access-zd2lb\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.432512 4806 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bea7524e-6205-4b23-bec9-028f0ebe3cf2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.432521 4806 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bea7524e-6205-4b23-bec9-028f0ebe3cf2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.605840 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:13 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:13 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:13 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.605900 4806 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.898454 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" event={"ID":"bea7524e-6205-4b23-bec9-028f0ebe3cf2","Type":"ContainerDied","Data":"a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7"} Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.898610 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522355-mn87x" Feb 17 15:23:13 crc kubenswrapper[4806]: I0217 15:23:13.898687 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a810f17960e8547161fa111e387a83fbcb8fc181aed2fa074761a9b17937b9c7" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.080019 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 15:23:14 crc kubenswrapper[4806]: E0217 15:23:14.080562 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea7524e-6205-4b23-bec9-028f0ebe3cf2" containerName="collect-profiles" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.080576 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea7524e-6205-4b23-bec9-028f0ebe3cf2" containerName="collect-profiles" Feb 17 15:23:14 crc kubenswrapper[4806]: E0217 15:23:14.080598 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90aa964a-223b-494e-b429-b63a9dfd7c3f" containerName="pruner" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.080604 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="90aa964a-223b-494e-b429-b63a9dfd7c3f" containerName="pruner" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.080710 4806 
memory_manager.go:354] "RemoveStaleState removing state" podUID="90aa964a-223b-494e-b429-b63a9dfd7c3f" containerName="pruner" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.080722 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea7524e-6205-4b23-bec9-028f0ebe3cf2" containerName="collect-profiles" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.081187 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.083918 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.084892 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.085250 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.142968 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.143099 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.243983 4806 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.244173 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.244273 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.279317 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.419930 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.608168 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:14 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:14 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:14 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.608594 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:14 crc kubenswrapper[4806]: I0217 15:23:14.991543 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 15:23:15 crc kubenswrapper[4806]: W0217 15:23:15.071709 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode63f59a6_0c8c_4466_b691_dcda57e6b729.slice/crio-cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c WatchSource:0}: Error finding container cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c: Status 404 returned error can't find the container with id cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c Feb 17 15:23:15 crc kubenswrapper[4806]: I0217 15:23:15.606956 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:15 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:15 crc 
kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:15 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:15 crc kubenswrapper[4806]: I0217 15:23:15.608511 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:15 crc kubenswrapper[4806]: I0217 15:23:15.989217 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e63f59a6-0c8c-4466-b691-dcda57e6b729","Type":"ContainerStarted","Data":"2e08040c8b2a080c391319ad6af60ae1da7032c2a66c78425485d0652409b0f3"} Feb 17 15:23:15 crc kubenswrapper[4806]: I0217 15:23:15.989268 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e63f59a6-0c8c-4466-b691-dcda57e6b729","Type":"ContainerStarted","Data":"cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c"} Feb 17 15:23:16 crc kubenswrapper[4806]: I0217 15:23:16.008049 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.008033076 podStartE2EDuration="2.008033076s" podCreationTimestamp="2026-02-17 15:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:16.007574855 +0000 UTC m=+157.538205286" watchObservedRunningTime="2026-02-17 15:23:16.008033076 +0000 UTC m=+157.538663487" Feb 17 15:23:16 crc kubenswrapper[4806]: I0217 15:23:16.076885 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5nlh8" Feb 17 15:23:16 crc kubenswrapper[4806]: I0217 15:23:16.605605 4806 patch_prober.go:28] interesting pod/router-default-5444994796-r4ldh container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 15:23:16 crc kubenswrapper[4806]: [-]has-synced failed: reason withheld Feb 17 15:23:16 crc kubenswrapper[4806]: [+]process-running ok Feb 17 15:23:16 crc kubenswrapper[4806]: healthz check failed Feb 17 15:23:16 crc kubenswrapper[4806]: I0217 15:23:16.606036 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r4ldh" podUID="6be3fc8f-849e-4d01-948a-46bc9ca06a05" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 15:23:17 crc kubenswrapper[4806]: I0217 15:23:17.019156 4806 generic.go:334] "Generic (PLEG): container finished" podID="e63f59a6-0c8c-4466-b691-dcda57e6b729" containerID="2e08040c8b2a080c391319ad6af60ae1da7032c2a66c78425485d0652409b0f3" exitCode=0 Feb 17 15:23:17 crc kubenswrapper[4806]: I0217 15:23:17.019204 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e63f59a6-0c8c-4466-b691-dcda57e6b729","Type":"ContainerDied","Data":"2e08040c8b2a080c391319ad6af60ae1da7032c2a66c78425485d0652409b0f3"} Feb 17 15:23:17 crc kubenswrapper[4806]: I0217 15:23:17.607596 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:17 crc kubenswrapper[4806]: I0217 15:23:17.610054 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r4ldh" Feb 17 15:23:20 crc kubenswrapper[4806]: I0217 15:23:20.640778 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:20 crc kubenswrapper[4806]: I0217 15:23:20.646367 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4sj79" Feb 17 15:23:20 crc 
kubenswrapper[4806]: I0217 15:23:20.870613 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-clfv7" Feb 17 15:23:22 crc kubenswrapper[4806]: I0217 15:23:22.807296 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:23:22 crc kubenswrapper[4806]: I0217 15:23:22.829328 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5af69f46-757a-4fab-adbd-d7a278868c94-metrics-certs\") pod \"network-metrics-daemon-h72qm\" (UID: \"5af69f46-757a-4fab-adbd-d7a278868c94\") " pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:23:23 crc kubenswrapper[4806]: I0217 15:23:23.104982 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h72qm" Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.574672 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.735600 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir\") pod \"e63f59a6-0c8c-4466-b691-dcda57e6b729\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.735718 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access\") pod \"e63f59a6-0c8c-4466-b691-dcda57e6b729\" (UID: \"e63f59a6-0c8c-4466-b691-dcda57e6b729\") " Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.735911 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e63f59a6-0c8c-4466-b691-dcda57e6b729" (UID: "e63f59a6-0c8c-4466-b691-dcda57e6b729"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.736368 4806 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e63f59a6-0c8c-4466-b691-dcda57e6b729-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.753051 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e63f59a6-0c8c-4466-b691-dcda57e6b729" (UID: "e63f59a6-0c8c-4466-b691-dcda57e6b729"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:24 crc kubenswrapper[4806]: I0217 15:23:24.838053 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e63f59a6-0c8c-4466-b691-dcda57e6b729-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:25 crc kubenswrapper[4806]: I0217 15:23:25.121206 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e63f59a6-0c8c-4466-b691-dcda57e6b729","Type":"ContainerDied","Data":"cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c"} Feb 17 15:23:25 crc kubenswrapper[4806]: I0217 15:23:25.121268 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbda0b756d387ab741165154762023a7f23083d8ab2a92d363ee31675a57b53c" Feb 17 15:23:25 crc kubenswrapper[4806]: I0217 15:23:25.121357 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 15:23:28 crc kubenswrapper[4806]: I0217 15:23:28.950487 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" Feb 17 15:23:34 crc kubenswrapper[4806]: I0217 15:23:34.785045 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:23:34 crc kubenswrapper[4806]: I0217 15:23:34.785832 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.113425 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h72qm"] Feb 17 15:23:38 crc kubenswrapper[4806]: W0217 15:23:38.125275 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af69f46_757a_4fab_adbd_d7a278868c94.slice/crio-20b763ee8fef1e4a41984f02f32e86ba677416fbed31c3f04d622b0174f1bacc WatchSource:0}: Error finding container 20b763ee8fef1e4a41984f02f32e86ba677416fbed31c3f04d622b0174f1bacc: Status 404 returned error can't find the container with id 20b763ee8fef1e4a41984f02f32e86ba677416fbed31c3f04d622b0174f1bacc Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.221506 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerStarted","Data":"6d498f45f49b9eca47e83a7f05f831b9cfab063e9b70c1fac673aced4c5e1046"} Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.227538 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerStarted","Data":"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69"} Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.239085 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerStarted","Data":"d7155ebb706bf7ec255b59c69e912fc63667e5e833c199b65b04589b73f3efd5"} Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.241774 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" 
event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerStarted","Data":"503555c017cda12438fff52458b83b67a835cab3b78623fc03ebf231e9f9aae0"}
Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.254160 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerStarted","Data":"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826"}
Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.262875 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerStarted","Data":"0f9f3aa3c58fd6672a6c55a0bc1c67036d03fbbf0a31dbd427d7cb37d4576fa4"}
Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.265220 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerStarted","Data":"6f5add4e9acdf6facf060af3f84cdb9b6eec83f5d64e3ab56be0b5f9431b6af6"}
Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.267383 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h72qm" event={"ID":"5af69f46-757a-4fab-adbd-d7a278868c94","Type":"ContainerStarted","Data":"20b763ee8fef1e4a41984f02f32e86ba677416fbed31c3f04d622b0174f1bacc"}
Feb 17 15:23:38 crc kubenswrapper[4806]: I0217 15:23:38.273656 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerStarted","Data":"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.283103 4806 generic.go:334] "Generic (PLEG): container finished" podID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerID="2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.283474 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerDied","Data":"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.297920 4806 generic.go:334] "Generic (PLEG): container finished" podID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerID="d7155ebb706bf7ec255b59c69e912fc63667e5e833c199b65b04589b73f3efd5" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.297989 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerDied","Data":"d7155ebb706bf7ec255b59c69e912fc63667e5e833c199b65b04589b73f3efd5"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.304296 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h72qm" event={"ID":"5af69f46-757a-4fab-adbd-d7a278868c94","Type":"ContainerStarted","Data":"3c1d7e0a0397e7b5ad76d44d806f39aa0af1c14f5beff5c9cf1a7873008229ca"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.310158 4806 generic.go:334] "Generic (PLEG): container finished" podID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerID="7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.310298 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerDied","Data":"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.326106 4806 generic.go:334] "Generic (PLEG): container finished" podID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerID="9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.326176 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerDied","Data":"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.333219 4806 generic.go:334] "Generic (PLEG): container finished" podID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerID="0f9f3aa3c58fd6672a6c55a0bc1c67036d03fbbf0a31dbd427d7cb37d4576fa4" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.333291 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerDied","Data":"0f9f3aa3c58fd6672a6c55a0bc1c67036d03fbbf0a31dbd427d7cb37d4576fa4"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.339642 4806 generic.go:334] "Generic (PLEG): container finished" podID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerID="6f5add4e9acdf6facf060af3f84cdb9b6eec83f5d64e3ab56be0b5f9431b6af6" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.339704 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerDied","Data":"6f5add4e9acdf6facf060af3f84cdb9b6eec83f5d64e3ab56be0b5f9431b6af6"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.343438 4806 generic.go:334] "Generic (PLEG): container finished" podID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerID="503555c017cda12438fff52458b83b67a835cab3b78623fc03ebf231e9f9aae0" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.343506 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerDied","Data":"503555c017cda12438fff52458b83b67a835cab3b78623fc03ebf231e9f9aae0"}
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.345334 4806 generic.go:334] "Generic (PLEG): container finished" podID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerID="6d498f45f49b9eca47e83a7f05f831b9cfab063e9b70c1fac673aced4c5e1046" exitCode=0
Feb 17 15:23:39 crc kubenswrapper[4806]: I0217 15:23:39.345362 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerDied","Data":"6d498f45f49b9eca47e83a7f05f831b9cfab063e9b70c1fac673aced4c5e1046"}
Feb 17 15:23:40 crc kubenswrapper[4806]: I0217 15:23:40.353190 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h72qm" event={"ID":"5af69f46-757a-4fab-adbd-d7a278868c94","Type":"ContainerStarted","Data":"5e6308ae8c9322d0728f72c71f3af16e036c3993a2d2221fe933b2c9554baa06"}
Feb 17 15:23:40 crc kubenswrapper[4806]: I0217 15:23:40.375678 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h72qm" podStartSLOduration=160.375660889 podStartE2EDuration="2m40.375660889s" podCreationTimestamp="2026-02-17 15:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:40.372957383 +0000 UTC m=+181.903587814" watchObservedRunningTime="2026-02-17 15:23:40.375660889 +0000 UTC m=+181.906291300"
Feb 17 15:23:41 crc kubenswrapper[4806]: I0217 15:23:41.104960 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tbmdq"
Feb 17 15:23:42 crc kubenswrapper[4806]: I0217 15:23:42.369045 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerStarted","Data":"757b5266445a624483d6427b85977bc2d1ec3e3b1153ba49d95ec406f63b5971"}
Feb 17 15:23:42 crc kubenswrapper[4806]: I0217 15:23:42.371671 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerStarted","Data":"615b776fb0db2468a2b330b6922f07ec8a2beaa0be871285ae3d7bede2e7d17e"}
Feb 17 15:23:42 crc kubenswrapper[4806]: I0217 15:23:42.391529 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wn2t6" podStartSLOduration=3.0771014230000002 podStartE2EDuration="33.391510305s" podCreationTimestamp="2026-02-17 15:23:09 +0000 UTC" firstStartedPulling="2026-02-17 15:23:10.758591512 +0000 UTC m=+152.289221933" lastFinishedPulling="2026-02-17 15:23:41.073000404 +0000 UTC m=+182.603630815" observedRunningTime="2026-02-17 15:23:42.390149132 +0000 UTC m=+183.920779553" watchObservedRunningTime="2026-02-17 15:23:42.391510305 +0000 UTC m=+183.922140716"
Feb 17 15:23:43 crc kubenswrapper[4806]: I0217 15:23:43.379214 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerStarted","Data":"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36"}
Feb 17 15:23:43 crc kubenswrapper[4806]: I0217 15:23:43.397112 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qtvt5" podStartSLOduration=3.225021466 podStartE2EDuration="35.397093507s" podCreationTimestamp="2026-02-17 15:23:08 +0000 UTC" firstStartedPulling="2026-02-17 15:23:09.727604527 +0000 UTC m=+151.258234938" lastFinishedPulling="2026-02-17 15:23:41.899676568 +0000 UTC m=+183.430306979" observedRunningTime="2026-02-17 15:23:43.395435837 +0000 UTC m=+184.926066268" watchObservedRunningTime="2026-02-17 15:23:43.397093507 +0000 UTC m=+184.927723918"
Feb 17 15:23:44 crc kubenswrapper[4806]: I0217 15:23:44.405084 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2xq5t" podStartSLOduration=3.234615997 podStartE2EDuration="36.405055946s" podCreationTimestamp="2026-02-17 15:23:08 +0000 UTC" firstStartedPulling="2026-02-17 15:23:09.727468574 +0000 UTC m=+151.258098985" lastFinishedPulling="2026-02-17 15:23:42.897908523 +0000 UTC m=+184.428538934" observedRunningTime="2026-02-17 15:23:44.400821944 +0000 UTC m=+185.931452355" watchObservedRunningTime="2026-02-17 15:23:44.405055946 +0000 UTC m=+185.935686347"
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.397565 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerStarted","Data":"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61"}
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.400132 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerStarted","Data":"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872"}
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.402063 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerStarted","Data":"5e95a73cb1296494acce3298ca1c907495708b2653095089e700c2e631711ee0"}
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.405801 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerStarted","Data":"ea668d50c411e7747cf3613452948a4c600948cc8df0a185c52f26eab8ac129b"}
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.408085 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerStarted","Data":"24deb8ff2db3325f36a4451ede2f791743682a27eb66a744fdc7467de6c8ce5a"}
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.416186 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qgs9g" podStartSLOduration=2.314542542 podStartE2EDuration="35.416172891s" podCreationTimestamp="2026-02-17 15:23:10 +0000 UTC" firstStartedPulling="2026-02-17 15:23:11.834936811 +0000 UTC m=+153.365567222" lastFinishedPulling="2026-02-17 15:23:44.93656716 +0000 UTC m=+186.467197571" observedRunningTime="2026-02-17 15:23:45.414566092 +0000 UTC m=+186.945196523" watchObservedRunningTime="2026-02-17 15:23:45.416172891 +0000 UTC m=+186.946803302"
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.428885 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7498n" podStartSLOduration=2.32453534 podStartE2EDuration="34.428871187s" podCreationTimestamp="2026-02-17 15:23:11 +0000 UTC" firstStartedPulling="2026-02-17 15:23:12.862436932 +0000 UTC m=+154.393067343" lastFinishedPulling="2026-02-17 15:23:44.966772779 +0000 UTC m=+186.497403190" observedRunningTime="2026-02-17 15:23:45.428361375 +0000 UTC m=+186.958991796" watchObservedRunningTime="2026-02-17 15:23:45.428871187 +0000 UTC m=+186.959501588"
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.445481 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xh468" podStartSLOduration=2.985902965 podStartE2EDuration="37.445461277s" podCreationTimestamp="2026-02-17 15:23:08 +0000 UTC" firstStartedPulling="2026-02-17 15:23:09.741486902 +0000 UTC m=+151.272117313" lastFinishedPulling="2026-02-17 15:23:44.201045214 +0000 UTC m=+185.731675625" observedRunningTime="2026-02-17 15:23:45.444645718 +0000 UTC m=+186.975276129" watchObservedRunningTime="2026-02-17 15:23:45.445461277 +0000 UTC m=+186.976091678"
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.470521 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bltqq" podStartSLOduration=3.136203864 podStartE2EDuration="38.470494091s" podCreationTimestamp="2026-02-17 15:23:07 +0000 UTC" firstStartedPulling="2026-02-17 15:23:09.729823361 +0000 UTC m=+151.260453772" lastFinishedPulling="2026-02-17 15:23:45.064113588 +0000 UTC m=+186.594743999" observedRunningTime="2026-02-17 15:23:45.469150129 +0000 UTC m=+186.999780540" watchObservedRunningTime="2026-02-17 15:23:45.470494091 +0000 UTC m=+187.001124502"
Feb 17 15:23:45 crc kubenswrapper[4806]: I0217 15:23:45.496869 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sbjk5" podStartSLOduration=2.227194792 podStartE2EDuration="34.496853467s" podCreationTimestamp="2026-02-17 15:23:11 +0000 UTC" firstStartedPulling="2026-02-17 15:23:12.886611456 +0000 UTC m=+154.417241867" lastFinishedPulling="2026-02-17 15:23:45.156270131 +0000 UTC m=+186.686900542" observedRunningTime="2026-02-17 15:23:45.494100451 +0000 UTC m=+187.024730872" watchObservedRunningTime="2026-02-17 15:23:45.496853467 +0000 UTC m=+187.027483878"
Feb 17 15:23:47 crc kubenswrapper[4806]: I0217 15:23:47.494116 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.144786 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bltqq"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.147692 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bltqq"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.292097 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bltqq"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.358604 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xh468"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.358666 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xh468"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.404741 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xh468"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.563351 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.563391 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.605270 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.778464 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2xq5t"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.778517 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2xq5t"
Feb 17 15:23:48 crc kubenswrapper[4806]: I0217 15:23:48.812868 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2xq5t"
Feb 17 15:23:49 crc kubenswrapper[4806]: I0217 15:23:49.127001 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"]
Feb 17 15:23:49 crc kubenswrapper[4806]: I0217 15:23:49.471162 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2xq5t"
Feb 17 15:23:49 crc kubenswrapper[4806]: I0217 15:23:49.506886 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.203857 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.203941 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.252731 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.533044 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.594279 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qgs9g"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.594335 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qgs9g"
Feb 17 15:23:50 crc kubenswrapper[4806]: I0217 15:23:50.641621 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qgs9g"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.484131 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qgs9g"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.559976 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7498n"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.560036 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7498n"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.640822 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"]
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.641154 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qtvt5" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="registry-server" containerID="cri-o://757b5266445a624483d6427b85977bc2d1ec3e3b1153ba49d95ec406f63b5971" gracePeriod=2
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.871218 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 15:23:51 crc kubenswrapper[4806]: E0217 15:23:51.871506 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63f59a6-0c8c-4466-b691-dcda57e6b729" containerName="pruner"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.871522 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63f59a6-0c8c-4466-b691-dcda57e6b729" containerName="pruner"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.871656 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63f59a6-0c8c-4466-b691-dcda57e6b729" containerName="pruner"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.872140 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.874646 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.874881 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.881647 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.971583 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.971882 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.987777 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sbjk5"
Feb 17 15:23:51 crc kubenswrapper[4806]: I0217 15:23:51.987935 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sbjk5"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.026115 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sbjk5"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.080116 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.080209 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.080297 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.111504 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.250091 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.447556 4806 generic.go:334] "Generic (PLEG): container finished" podID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerID="757b5266445a624483d6427b85977bc2d1ec3e3b1153ba49d95ec406f63b5971" exitCode=0
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.447969 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerDied","Data":"757b5266445a624483d6427b85977bc2d1ec3e3b1153ba49d95ec406f63b5971"}
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.502302 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sbjk5"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.607196 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7498n" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="registry-server" probeResult="failure" output=<
Feb 17 15:23:52 crc kubenswrapper[4806]: 	timeout: failed to connect service ":50051" within 1s
Feb 17 15:23:52 crc kubenswrapper[4806]:  >
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.645786 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.652359 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.822648 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities\") pod \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") "
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.822722 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content\") pod \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") "
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.822764 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rtdp\" (UniqueName: \"kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp\") pod \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\" (UID: \"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73\") "
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.823543 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities" (OuterVolumeSpecName: "utilities") pod "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" (UID: "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.827589 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp" (OuterVolumeSpecName: "kube-api-access-4rtdp") pod "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" (UID: "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73"). InnerVolumeSpecName "kube-api-access-4rtdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.892955 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" (UID: "c5c38b9f-9f1b-42b9-924d-a7f9ff193e73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.924347 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.924375 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:52 crc kubenswrapper[4806]: I0217 15:23:52.924388 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rtdp\" (UniqueName: \"kubernetes.io/projected/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73-kube-api-access-4rtdp\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.441941 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xq5t"]
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.442641 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2xq5t" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="registry-server" containerID="cri-o://696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36" gracePeriod=2
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.457039 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53265568-e6c8-4d2c-8902-b6c65fdff13a","Type":"ContainerStarted","Data":"2bb6d25297d7a0a5f33057b3582f7164b7c859fb7665b01f92db39383ffa99e5"}
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.457093 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53265568-e6c8-4d2c-8902-b6c65fdff13a","Type":"ContainerStarted","Data":"c355c9fe3ace900bf02945a7201a129c559c965e53d0863742badbd44f1ec45d"}
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.462043 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtvt5"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.462089 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtvt5" event={"ID":"c5c38b9f-9f1b-42b9-924d-a7f9ff193e73","Type":"ContainerDied","Data":"ebac9a638bc57bbc28551ac85ef3c5f0a00f39adf94d2417a651795b33360951"}
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.462117 4806 scope.go:117] "RemoveContainer" containerID="757b5266445a624483d6427b85977bc2d1ec3e3b1153ba49d95ec406f63b5971"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.475485 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.4754310569999998 podStartE2EDuration="2.475431057s" podCreationTimestamp="2026-02-17 15:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:53.472814874 +0000 UTC m=+195.003445315" watchObservedRunningTime="2026-02-17 15:23:53.475431057 +0000 UTC m=+195.006061458"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.487928 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"]
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.490702 4806 scope.go:117] "RemoveContainer" containerID="6d498f45f49b9eca47e83a7f05f831b9cfab063e9b70c1fac673aced4c5e1046"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.495993 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qtvt5"]
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.557973 4806 scope.go:117] "RemoveContainer" containerID="b700a29bc8239f2966dab9aa428a0c70dc52d621e379191fa595a1b5735e683d"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.838682 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2xq5t"
Feb 17 15:23:53 crc kubenswrapper[4806]: I0217 15:23:53.938957 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content\") pod \"d9989003-88f8-4fe7-889b-28bedb54e8ae\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") "
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.005860 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9989003-88f8-4fe7-889b-28bedb54e8ae" (UID: "d9989003-88f8-4fe7-889b-28bedb54e8ae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.041066 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nck8k\" (UniqueName: \"kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k\") pod \"d9989003-88f8-4fe7-889b-28bedb54e8ae\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") "
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.041125 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities\") pod \"d9989003-88f8-4fe7-889b-28bedb54e8ae\" (UID: \"d9989003-88f8-4fe7-889b-28bedb54e8ae\") "
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.041473 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.041711 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"]
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.041973 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qgs9g" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="registry-server" containerID="cri-o://2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61" gracePeriod=2
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.043320 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities" (OuterVolumeSpecName: "utilities") pod "d9989003-88f8-4fe7-889b-28bedb54e8ae" (UID: "d9989003-88f8-4fe7-889b-28bedb54e8ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.051148 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k" (OuterVolumeSpecName: "kube-api-access-nck8k") pod "d9989003-88f8-4fe7-889b-28bedb54e8ae" (UID: "d9989003-88f8-4fe7-889b-28bedb54e8ae"). InnerVolumeSpecName "kube-api-access-nck8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.143196 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nck8k\" (UniqueName: \"kubernetes.io/projected/d9989003-88f8-4fe7-889b-28bedb54e8ae-kube-api-access-nck8k\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.143688 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9989003-88f8-4fe7-889b-28bedb54e8ae-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.456323 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgs9g"
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.468749 4806 generic.go:334] "Generic (PLEG): container finished" podID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerID="696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36" exitCode=0
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.468848 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerDied","Data":"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36"}
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.468888 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2xq5t" event={"ID":"d9989003-88f8-4fe7-889b-28bedb54e8ae","Type":"ContainerDied","Data":"2f5244d398ba1a28d7939ca7c74528048e9efb2b510b2f51f4f20fc6f0de29cc"}
Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.468896 4806 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-2xq5t" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.468919 4806 scope.go:117] "RemoveContainer" containerID="696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.470036 4806 generic.go:334] "Generic (PLEG): container finished" podID="53265568-e6c8-4d2c-8902-b6c65fdff13a" containerID="2bb6d25297d7a0a5f33057b3582f7164b7c859fb7665b01f92db39383ffa99e5" exitCode=0 Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.470172 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53265568-e6c8-4d2c-8902-b6c65fdff13a","Type":"ContainerDied","Data":"2bb6d25297d7a0a5f33057b3582f7164b7c859fb7665b01f92db39383ffa99e5"} Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.510049 4806 generic.go:334] "Generic (PLEG): container finished" podID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerID="2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61" exitCode=0 Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.510129 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgs9g" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.510158 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerDied","Data":"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61"} Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.510365 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgs9g" event={"ID":"83916ba1-8102-4101-8876-9a4abdc9e48f","Type":"ContainerDied","Data":"170b588d58cd5a2b09fb847252a6549f88d2b42bad4f35eabafcf75fd8ae4075"} Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.526617 4806 scope.go:117] "RemoveContainer" containerID="2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.530889 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2xq5t"] Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.542062 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2xq5t"] Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.549916 4806 scope.go:117] "RemoveContainer" containerID="b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.566273 4806 scope.go:117] "RemoveContainer" containerID="696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.566886 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36\": container with ID starting with 696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36 not found: ID does not exist" 
containerID="696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.566956 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36"} err="failed to get container status \"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36\": rpc error: code = NotFound desc = could not find container \"696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36\": container with ID starting with 696b299e1ae6d9793c0d1169d56f38ee995aaf87adf214b9b6ed7770ec260a36 not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.567023 4806 scope.go:117] "RemoveContainer" containerID="2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.567479 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826\": container with ID starting with 2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826 not found: ID does not exist" containerID="2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.567524 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826"} err="failed to get container status \"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826\": rpc error: code = NotFound desc = could not find container \"2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826\": container with ID starting with 2cbe86f55a27ddb3d69b351fb3b887a334a6a743417e7e2f99b36590a24f3826 not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.567553 4806 scope.go:117] 
"RemoveContainer" containerID="b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.568033 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24\": container with ID starting with b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24 not found: ID does not exist" containerID="b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.568080 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24"} err="failed to get container status \"b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24\": rpc error: code = NotFound desc = could not find container \"b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24\": container with ID starting with b224bb69677ba6d3a013d9f0104cf5cf8253b49c7239c01619155fe472cbeb24 not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.568115 4806 scope.go:117] "RemoveContainer" containerID="2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.584955 4806 scope.go:117] "RemoveContainer" containerID="7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.606236 4806 scope.go:117] "RemoveContainer" containerID="6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.622489 4806 scope.go:117] "RemoveContainer" containerID="2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.622799 4806 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61\": container with ID starting with 2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61 not found: ID does not exist" containerID="2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.622835 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61"} err="failed to get container status \"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61\": rpc error: code = NotFound desc = could not find container \"2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61\": container with ID starting with 2f81bda1f4bf6fe60734431f4182002204345c1ffe7f1cfc867daeb5cd8e8d61 not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.622862 4806 scope.go:117] "RemoveContainer" containerID="7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.623160 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b\": container with ID starting with 7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b not found: ID does not exist" containerID="7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.623185 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b"} err="failed to get container status \"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b\": rpc error: code = NotFound desc = could not find container 
\"7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b\": container with ID starting with 7a6aedfc846b9fe6732fcd051ee0e05c402f60537f0cfcd52d9ab5b50d53b67b not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.623204 4806 scope.go:117] "RemoveContainer" containerID="6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4" Feb 17 15:23:54 crc kubenswrapper[4806]: E0217 15:23:54.623527 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4\": container with ID starting with 6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4 not found: ID does not exist" containerID="6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.623552 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4"} err="failed to get container status \"6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4\": rpc error: code = NotFound desc = could not find container \"6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4\": container with ID starting with 6606373d303819b6998a4c5368cee91eb180df8f93aaabed4fc8673bd68d6ec4 not found: ID does not exist" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.649101 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gldp\" (UniqueName: \"kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp\") pod \"83916ba1-8102-4101-8876-9a4abdc9e48f\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.649249 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities\") pod \"83916ba1-8102-4101-8876-9a4abdc9e48f\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.649321 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content\") pod \"83916ba1-8102-4101-8876-9a4abdc9e48f\" (UID: \"83916ba1-8102-4101-8876-9a4abdc9e48f\") " Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.650130 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities" (OuterVolumeSpecName: "utilities") pod "83916ba1-8102-4101-8876-9a4abdc9e48f" (UID: "83916ba1-8102-4101-8876-9a4abdc9e48f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.655099 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp" (OuterVolumeSpecName: "kube-api-access-4gldp") pod "83916ba1-8102-4101-8876-9a4abdc9e48f" (UID: "83916ba1-8102-4101-8876-9a4abdc9e48f"). InnerVolumeSpecName "kube-api-access-4gldp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.681300 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83916ba1-8102-4101-8876-9a4abdc9e48f" (UID: "83916ba1-8102-4101-8876-9a4abdc9e48f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.751477 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gldp\" (UniqueName: \"kubernetes.io/projected/83916ba1-8102-4101-8876-9a4abdc9e48f-kube-api-access-4gldp\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.751533 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.751550 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83916ba1-8102-4101-8876-9a4abdc9e48f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.846849 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"] Feb 17 15:23:54 crc kubenswrapper[4806]: I0217 15:23:54.854399 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgs9g"] Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.169770 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" path="/var/lib/kubelet/pods/83916ba1-8102-4101-8876-9a4abdc9e48f/volumes" Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.170645 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" path="/var/lib/kubelet/pods/c5c38b9f-9f1b-42b9-924d-a7f9ff193e73/volumes" Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.171334 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" path="/var/lib/kubelet/pods/d9989003-88f8-4fe7-889b-28bedb54e8ae/volumes" Feb 17 15:23:55 crc 
kubenswrapper[4806]: I0217 15:23:55.862855 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.966627 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access\") pod \"53265568-e6c8-4d2c-8902-b6c65fdff13a\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.966747 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir\") pod \"53265568-e6c8-4d2c-8902-b6c65fdff13a\" (UID: \"53265568-e6c8-4d2c-8902-b6c65fdff13a\") " Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.966929 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53265568-e6c8-4d2c-8902-b6c65fdff13a" (UID: "53265568-e6c8-4d2c-8902-b6c65fdff13a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:23:55 crc kubenswrapper[4806]: I0217 15:23:55.971942 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53265568-e6c8-4d2c-8902-b6c65fdff13a" (UID: "53265568-e6c8-4d2c-8902-b6c65fdff13a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.068526 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53265568-e6c8-4d2c-8902-b6c65fdff13a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.068576 4806 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53265568-e6c8-4d2c-8902-b6c65fdff13a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.452230 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.452651 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sbjk5" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="registry-server" containerID="cri-o://6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872" gracePeriod=2 Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.529698 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"53265568-e6c8-4d2c-8902-b6c65fdff13a","Type":"ContainerDied","Data":"c355c9fe3ace900bf02945a7201a129c559c965e53d0863742badbd44f1ec45d"} Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.529740 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.529756 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c355c9fe3ace900bf02945a7201a129c559c965e53d0863742badbd44f1ec45d" Feb 17 15:23:56 crc kubenswrapper[4806]: I0217 15:23:56.918590 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.080491 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md\") pod \"bf7442d4-1501-4230-98c0-d54f138f7c85\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.080582 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities\") pod \"bf7442d4-1501-4230-98c0-d54f138f7c85\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.080635 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content\") pod \"bf7442d4-1501-4230-98c0-d54f138f7c85\" (UID: \"bf7442d4-1501-4230-98c0-d54f138f7c85\") " Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.082145 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities" (OuterVolumeSpecName: "utilities") pod "bf7442d4-1501-4230-98c0-d54f138f7c85" (UID: "bf7442d4-1501-4230-98c0-d54f138f7c85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.085247 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md" (OuterVolumeSpecName: "kube-api-access-ff5md") pod "bf7442d4-1501-4230-98c0-d54f138f7c85" (UID: "bf7442d4-1501-4230-98c0-d54f138f7c85"). InnerVolumeSpecName "kube-api-access-ff5md". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.181591 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/bf7442d4-1501-4230-98c0-d54f138f7c85-kube-api-access-ff5md\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.181648 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.239035 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf7442d4-1501-4230-98c0-d54f138f7c85" (UID: "bf7442d4-1501-4230-98c0-d54f138f7c85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.283047 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf7442d4-1501-4230-98c0-d54f138f7c85-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.541783 4806 generic.go:334] "Generic (PLEG): container finished" podID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerID="6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872" exitCode=0 Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.541833 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerDied","Data":"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872"} Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.541862 4806 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sbjk5" event={"ID":"bf7442d4-1501-4230-98c0-d54f138f7c85","Type":"ContainerDied","Data":"a247183775654341e03a1ea5d35ebb5c9bdec57b5f9981a8393b3fd3ff639f03"} Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.541882 4806 scope.go:117] "RemoveContainer" containerID="6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.541883 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbjk5" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.573103 4806 scope.go:117] "RemoveContainer" containerID="9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.597533 4806 scope.go:117] "RemoveContainer" containerID="d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.599595 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.602261 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sbjk5"] Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.614604 4806 scope.go:117] "RemoveContainer" containerID="6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.615140 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872\": container with ID starting with 6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872 not found: ID does not exist" containerID="6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.615190 4806 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872"} err="failed to get container status \"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872\": rpc error: code = NotFound desc = could not find container \"6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872\": container with ID starting with 6e608c0b27ff4a33bb6d5f8135780b9ea11ffeb0b25cfe09c3f405a3e3a45872 not found: ID does not exist" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.615220 4806 scope.go:117] "RemoveContainer" containerID="9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.615619 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69\": container with ID starting with 9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69 not found: ID does not exist" containerID="9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.615650 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69"} err="failed to get container status \"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69\": rpc error: code = NotFound desc = could not find container \"9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69\": container with ID starting with 9ca46aca4e8cdab199bd355c8891c84ed158f70406cb979dc67975bed6e3bc69 not found: ID does not exist" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.615664 4806 scope.go:117] "RemoveContainer" containerID="d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 
15:23:57.615929 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1\": container with ID starting with d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1 not found: ID does not exist" containerID="d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.615950 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1"} err="failed to get container status \"d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1\": rpc error: code = NotFound desc = could not find container \"d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1\": container with ID starting with d4dc00832531523bd9a6445ab5f7ccb684b8285d669ef54a3d34a2e2720321f1 not found: ID does not exist" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.670700 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671116 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671133 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671148 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671156 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="registry-server" Feb 17 15:23:57 crc 
kubenswrapper[4806]: E0217 15:23:57.671167 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671176 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671187 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671196 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671206 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671214 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671223 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671233 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671249 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53265568-e6c8-4d2c-8902-b6c65fdff13a" containerName="pruner" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671258 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="53265568-e6c8-4d2c-8902-b6c65fdff13a" containerName="pruner" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 
15:23:57.671268 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671276 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671291 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671299 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671307 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671316 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671328 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671338 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 15:23:57.671348 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671356 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="extract-utilities" Feb 17 15:23:57 crc kubenswrapper[4806]: E0217 
15:23:57.671367 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671375 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="extract-content" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671525 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671540 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="53265568-e6c8-4d2c-8902-b6c65fdff13a" containerName="pruner" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671553 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9989003-88f8-4fe7-889b-28bedb54e8ae" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671568 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="83916ba1-8102-4101-8876-9a4abdc9e48f" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.671580 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c38b9f-9f1b-42b9-924d-a7f9ff193e73" containerName="registry-server" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.672278 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.679011 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.679012 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.686432 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.695099 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.695158 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.695363 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.796141 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.796187 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.796252 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.796256 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.796321 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.817169 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access\") pod \"installer-9-crc\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:57 crc kubenswrapper[4806]: I0217 15:23:57.998481 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:23:58 crc kubenswrapper[4806]: I0217 15:23:58.218307 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bltqq" Feb 17 15:23:58 crc kubenswrapper[4806]: I0217 15:23:58.285452 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 15:23:58 crc kubenswrapper[4806]: W0217 15:23:58.303114 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6cf0e72c_ef97_46e0_9e67_044d1f893320.slice/crio-a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb WatchSource:0}: Error finding container a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb: Status 404 returned error can't find the container with id a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb Feb 17 15:23:58 crc kubenswrapper[4806]: I0217 15:23:58.405361 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xh468" Feb 17 15:23:58 crc kubenswrapper[4806]: I0217 15:23:58.548262 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6cf0e72c-ef97-46e0-9e67-044d1f893320","Type":"ContainerStarted","Data":"a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb"} Feb 17 15:23:59 crc kubenswrapper[4806]: I0217 15:23:59.169266 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7442d4-1501-4230-98c0-d54f138f7c85" path="/var/lib/kubelet/pods/bf7442d4-1501-4230-98c0-d54f138f7c85/volumes" Feb 17 15:23:59 crc kubenswrapper[4806]: I0217 15:23:59.558375 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6cf0e72c-ef97-46e0-9e67-044d1f893320","Type":"ContainerStarted","Data":"c61c7c3a4ca50cc317926b697c7949cfe779e5a2d68ed853f1970fdd5009dc54"} Feb 17 15:23:59 crc kubenswrapper[4806]: I0217 15:23:59.581395 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.5813644829999998 podStartE2EDuration="2.581364483s" podCreationTimestamp="2026-02-17 15:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:23:59.580098382 +0000 UTC m=+201.110728793" watchObservedRunningTime="2026-02-17 15:23:59.581364483 +0000 UTC m=+201.111994934" Feb 17 15:24:01 crc kubenswrapper[4806]: I0217 15:24:01.629934 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:24:01 crc kubenswrapper[4806]: I0217 15:24:01.696392 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7498n" Feb 17 15:24:04 crc kubenswrapper[4806]: I0217 15:24:04.785168 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:24:04 crc kubenswrapper[4806]: I0217 15:24:04.786016 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:24:04 crc kubenswrapper[4806]: I0217 15:24:04.786085 4806 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:24:04 crc kubenswrapper[4806]: I0217 15:24:04.786920 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:24:04 crc kubenswrapper[4806]: I0217 15:24:04.787016 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79" gracePeriod=600 Feb 17 15:24:05 crc kubenswrapper[4806]: I0217 15:24:05.620934 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79" exitCode=0 Feb 17 15:24:05 crc kubenswrapper[4806]: I0217 15:24:05.621073 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79"} Feb 17 15:24:05 crc kubenswrapper[4806]: I0217 15:24:05.622083 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b"} Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.165332 4806 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerName="oauth-openshift" containerID="cri-o://ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34" gracePeriod=15 Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.616512 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.675368 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj"] Feb 17 15:24:14 crc kubenswrapper[4806]: E0217 15:24:14.675711 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerName="oauth-openshift" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.675733 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerName="oauth-openshift" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.675837 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerName="oauth-openshift" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.676271 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.681677 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj"] Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.689283 4806 generic.go:334] "Generic (PLEG): container finished" podID="10565cb3-8e68-4dd9-9bac-fc770b23825b" containerID="ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34" exitCode=0 Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.689328 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" event={"ID":"10565cb3-8e68-4dd9-9bac-fc770b23825b","Type":"ContainerDied","Data":"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34"} Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.689356 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" event={"ID":"10565cb3-8e68-4dd9-9bac-fc770b23825b","Type":"ContainerDied","Data":"86d7a829cb2b38e776164deaff9d02e2f5c22a87db54d35be6fd68af132bf088"} Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.689376 4806 scope.go:117] "RemoveContainer" containerID="ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.689586 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dp9cm" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696540 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696631 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696661 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696704 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696747 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: 
\"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696783 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696819 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2v4n\" (UniqueName: \"kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696857 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696891 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696916 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: 
I0217 15:24:14.696939 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.696964 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.697005 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.697044 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig\") pod \"10565cb3-8e68-4dd9-9bac-fc770b23825b\" (UID: \"10565cb3-8e68-4dd9-9bac-fc770b23825b\") " Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.698086 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.698810 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.698850 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.699508 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.703510 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.704791 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.706155 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.706630 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n" (OuterVolumeSpecName: "kube-api-access-n2v4n") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "kube-api-access-n2v4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.713896 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.714175 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.714650 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.714654 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.715038 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.717274 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "10565cb3-8e68-4dd9-9bac-fc770b23825b" (UID: "10565cb3-8e68-4dd9-9bac-fc770b23825b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.751351 4806 scope.go:117] "RemoveContainer" containerID="ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34" Feb 17 15:24:14 crc kubenswrapper[4806]: E0217 15:24:14.752005 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34\": container with ID starting with ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34 not found: ID does not exist" containerID="ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.752052 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34"} err="failed to get container status \"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34\": rpc error: code = NotFound desc = could not find container \"ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34\": container with ID starting with ffafb505319fe16a89b7bb266224b6bceecb0dc2eda2d0ae9a1ce3001d87fe34 not found: ID does not exist" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.798893 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-dir\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799248 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799338 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799448 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-policies\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799525 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " 
pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799614 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799688 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799763 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.799843 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 
15:24:14.799915 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800011 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbchm\" (UniqueName: \"kubernetes.io/projected/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-kube-api-access-wbchm\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800102 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800202 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800300 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800440 4806 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800507 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800576 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800634 4806 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800696 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800758 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 
15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800907 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.800965 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801025 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801085 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801148 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801222 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801305 4806 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/10565cb3-8e68-4dd9-9bac-fc770b23825b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.801423 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2v4n\" (UniqueName: \"kubernetes.io/projected/10565cb3-8e68-4dd9-9bac-fc770b23825b-kube-api-access-n2v4n\") on node \"crc\" DevicePath \"\"" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.903187 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.903271 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.903370 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904222 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-dir\") pod 
\"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904314 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904373 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904440 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-dir\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904500 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-policies\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904560 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904608 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904659 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904713 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904779 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" 
(UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904900 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.904991 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbchm\" (UniqueName: \"kubernetes.io/projected/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-kube-api-access-wbchm\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.906328 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-audit-policies\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.906990 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.907173 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.908100 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.909163 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.909175 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.910036 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" 
Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.910530 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.911395 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.911691 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.912147 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.912547 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.928303 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbchm\" (UniqueName: \"kubernetes.io/projected/62b2d8ff-73a7-4d16-9c7d-f50280fdb501-kube-api-access-wbchm\") pod \"oauth-openshift-7cc79f59b7-2vfsj\" (UID: \"62b2d8ff-73a7-4d16-9c7d-f50280fdb501\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:14 crc kubenswrapper[4806]: I0217 15:24:14.996312 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.040455 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"] Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.044420 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dp9cm"] Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.193587 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10565cb3-8e68-4dd9-9bac-fc770b23825b" path="/var/lib/kubelet/pods/10565cb3-8e68-4dd9-9bac-fc770b23825b/volumes" Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.341197 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj"] Feb 17 15:24:15 crc kubenswrapper[4806]: W0217 15:24:15.351618 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62b2d8ff_73a7_4d16_9c7d_f50280fdb501.slice/crio-c0a1020c3b5a0e81e5738ac61e90be8451042d61904130823253f9fc22b79620 
WatchSource:0}: Error finding container c0a1020c3b5a0e81e5738ac61e90be8451042d61904130823253f9fc22b79620: Status 404 returned error can't find the container with id c0a1020c3b5a0e81e5738ac61e90be8451042d61904130823253f9fc22b79620 Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.701967 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" event={"ID":"62b2d8ff-73a7-4d16-9c7d-f50280fdb501","Type":"ContainerStarted","Data":"7640ec21149390ed989ed2fb9aee64fdb827a636dddc8dcb32b0bb1c8021eb54"} Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.702603 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" event={"ID":"62b2d8ff-73a7-4d16-9c7d-f50280fdb501","Type":"ContainerStarted","Data":"c0a1020c3b5a0e81e5738ac61e90be8451042d61904130823253f9fc22b79620"} Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.702649 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:15 crc kubenswrapper[4806]: I0217 15:24:15.730631 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" podStartSLOduration=26.730597928999998 podStartE2EDuration="26.730597929s" podCreationTimestamp="2026-02-17 15:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:24:15.725674259 +0000 UTC m=+217.256304720" watchObservedRunningTime="2026-02-17 15:24:15.730597929 +0000 UTC m=+217.261228380" Feb 17 15:24:16 crc kubenswrapper[4806]: I0217 15:24:16.124874 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7cc79f59b7-2vfsj" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.400952 4806 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-bltqq"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.403735 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bltqq" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="registry-server" containerID="cri-o://5e95a73cb1296494acce3298ca1c907495708b2653095089e700c2e631711ee0" gracePeriod=30 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.421630 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xh468"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.423783 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xh468" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="registry-server" containerID="cri-o://ea668d50c411e7747cf3613452948a4c600948cc8df0a185c52f26eab8ac129b" gracePeriod=30 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.436107 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.436301 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator" containerID="cri-o://b162d3bd96a4774cbb7c48533c1b2121c9a67d3d850962b55ffd80f8352d3062" gracePeriod=30 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.447340 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2t6"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.447629 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wn2t6" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="registry-server" 
containerID="cri-o://615b776fb0db2468a2b330b6922f07ec8a2beaa0be871285ae3d7bede2e7d17e" gracePeriod=30 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.461340 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7498n"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.461719 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7498n" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="registry-server" containerID="cri-o://24deb8ff2db3325f36a4451ede2f791743682a27eb66a744fdc7467de6c8ce5a" gracePeriod=30 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.472100 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbz4l"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.472875 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.484997 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbz4l"] Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.628227 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk7qk\" (UniqueName: \"kubernetes.io/projected/7cae252d-6eec-4e1e-a829-9b11b21c4d75-kube-api-access-xk7qk\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.628294 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbz4l\" 
(UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.628313 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.729391 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk7qk\" (UniqueName: \"kubernetes.io/projected/7cae252d-6eec-4e1e-a829-9b11b21c4d75-kube-api-access-xk7qk\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.729473 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.729493 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.731681 4806 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.737967 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7cae252d-6eec-4e1e-a829-9b11b21c4d75-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.749208 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk7qk\" (UniqueName: \"kubernetes.io/projected/7cae252d-6eec-4e1e-a829-9b11b21c4d75-kube-api-access-xk7qk\") pod \"marketplace-operator-79b997595-gbz4l\" (UID: \"7cae252d-6eec-4e1e-a829-9b11b21c4d75\") " pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.814912 4806 generic.go:334] "Generic (PLEG): container finished" podID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerID="24deb8ff2db3325f36a4451ede2f791743682a27eb66a744fdc7467de6c8ce5a" exitCode=0 Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.815330 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerDied","Data":"24deb8ff2db3325f36a4451ede2f791743682a27eb66a744fdc7467de6c8ce5a"} Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.818043 4806 generic.go:334] "Generic (PLEG): container finished" podID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerID="b162d3bd96a4774cbb7c48533c1b2121c9a67d3d850962b55ffd80f8352d3062" exitCode=0 
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.818137 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" event={"ID":"d4c76f4b-80c1-409a-acba-39a9edf0c975","Type":"ContainerDied","Data":"b162d3bd96a4774cbb7c48533c1b2121c9a67d3d850962b55ffd80f8352d3062"}
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.832418 4806 generic.go:334] "Generic (PLEG): container finished" podID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerID="5e95a73cb1296494acce3298ca1c907495708b2653095089e700c2e631711ee0" exitCode=0
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.832470 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerDied","Data":"5e95a73cb1296494acce3298ca1c907495708b2653095089e700c2e631711ee0"}
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.840835 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l"
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.858176 4806 generic.go:334] "Generic (PLEG): container finished" podID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerID="ea668d50c411e7747cf3613452948a4c600948cc8df0a185c52f26eab8ac129b" exitCode=0
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.858231 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerDied","Data":"ea668d50c411e7747cf3613452948a4c600948cc8df0a185c52f26eab8ac129b"}
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.859316 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bltqq"
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.860541 4806 generic.go:334] "Generic (PLEG): container finished" podID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerID="615b776fb0db2468a2b330b6922f07ec8a2beaa0be871285ae3d7bede2e7d17e" exitCode=0
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.860563 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerDied","Data":"615b776fb0db2468a2b330b6922f07ec8a2beaa0be871285ae3d7bede2e7d17e"}
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.927547 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv"
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.940698 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xh468"
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.971576 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7498n"
Feb 17 15:24:29 crc kubenswrapper[4806]: I0217 15:24:29.972384 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033337 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca\") pod \"d4c76f4b-80c1-409a-acba-39a9edf0c975\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033465 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h777r\" (UniqueName: \"kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r\") pod \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033527 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics\") pod \"d4c76f4b-80c1-409a-acba-39a9edf0c975\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033624 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content\") pod \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033673 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5df52\" (UniqueName: \"kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52\") pod \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033740 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content\") pod \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033785 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities\") pod \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\" (UID: \"93c131a2-4035-4267-9ed4-a4aef44c7ca5\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033822 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gfxc\" (UniqueName: \"kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc\") pod \"d4c76f4b-80c1-409a-acba-39a9edf0c975\" (UID: \"d4c76f4b-80c1-409a-acba-39a9edf0c975\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.033869 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities\") pod \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\" (UID: \"141fd58d-8ec4-45ea-af22-89b1c8a0444d\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.035327 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d4c76f4b-80c1-409a-acba-39a9edf0c975" (UID: "d4c76f4b-80c1-409a-acba-39a9edf0c975"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.036021 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities" (OuterVolumeSpecName: "utilities") pod "93c131a2-4035-4267-9ed4-a4aef44c7ca5" (UID: "93c131a2-4035-4267-9ed4-a4aef44c7ca5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.036745 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities" (OuterVolumeSpecName: "utilities") pod "141fd58d-8ec4-45ea-af22-89b1c8a0444d" (UID: "141fd58d-8ec4-45ea-af22-89b1c8a0444d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.040628 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r" (OuterVolumeSpecName: "kube-api-access-h777r") pod "141fd58d-8ec4-45ea-af22-89b1c8a0444d" (UID: "141fd58d-8ec4-45ea-af22-89b1c8a0444d"). InnerVolumeSpecName "kube-api-access-h777r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.041022 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52" (OuterVolumeSpecName: "kube-api-access-5df52") pod "93c131a2-4035-4267-9ed4-a4aef44c7ca5" (UID: "93c131a2-4035-4267-9ed4-a4aef44c7ca5"). InnerVolumeSpecName "kube-api-access-5df52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.043163 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d4c76f4b-80c1-409a-acba-39a9edf0c975" (UID: "d4c76f4b-80c1-409a-acba-39a9edf0c975"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.047812 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc" (OuterVolumeSpecName: "kube-api-access-4gfxc") pod "d4c76f4b-80c1-409a-acba-39a9edf0c975" (UID: "d4c76f4b-80c1-409a-acba-39a9edf0c975"). InnerVolumeSpecName "kube-api-access-4gfxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.097613 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "141fd58d-8ec4-45ea-af22-89b1c8a0444d" (UID: "141fd58d-8ec4-45ea-af22-89b1c8a0444d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.130715 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93c131a2-4035-4267-9ed4-a4aef44c7ca5" (UID: "93c131a2-4035-4267-9ed4-a4aef44c7ca5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135221 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities\") pod \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135305 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pqvn\" (UniqueName: \"kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn\") pod \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135329 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities\") pod \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135345 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5vf\" (UniqueName: \"kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf\") pod \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135372 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content\") pod \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\" (UID: \"81712631-a6c3-4b56-ad8c-dd51bc0d217b\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135441 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content\") pod \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\" (UID: \"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49\") "
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135644 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135661 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5df52\" (UniqueName: \"kubernetes.io/projected/93c131a2-4035-4267-9ed4-a4aef44c7ca5-kube-api-access-5df52\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135672 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135681 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c131a2-4035-4267-9ed4-a4aef44c7ca5-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135690 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gfxc\" (UniqueName: \"kubernetes.io/projected/d4c76f4b-80c1-409a-acba-39a9edf0c975-kube-api-access-4gfxc\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135698 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/141fd58d-8ec4-45ea-af22-89b1c8a0444d-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135707 4806 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135715 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h777r\" (UniqueName: \"kubernetes.io/projected/141fd58d-8ec4-45ea-af22-89b1c8a0444d-kube-api-access-h777r\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135723 4806 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d4c76f4b-80c1-409a-acba-39a9edf0c975-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.135913 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities" (OuterVolumeSpecName: "utilities") pod "81712631-a6c3-4b56-ad8c-dd51bc0d217b" (UID: "81712631-a6c3-4b56-ad8c-dd51bc0d217b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.137810 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities" (OuterVolumeSpecName: "utilities") pod "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" (UID: "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.152549 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn" (OuterVolumeSpecName: "kube-api-access-4pqvn") pod "81712631-a6c3-4b56-ad8c-dd51bc0d217b" (UID: "81712631-a6c3-4b56-ad8c-dd51bc0d217b"). InnerVolumeSpecName "kube-api-access-4pqvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.152678 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf" (OuterVolumeSpecName: "kube-api-access-bz5vf") pod "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" (UID: "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49"). InnerVolumeSpecName "kube-api-access-bz5vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.170201 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gbz4l"]
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.186172 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" (UID: "ecfd1573-8314-4dcd-9f0c-9c1f1b293c49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.237262 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.237302 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.237310 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.237319 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pqvn\" (UniqueName: \"kubernetes.io/projected/81712631-a6c3-4b56-ad8c-dd51bc0d217b-kube-api-access-4pqvn\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.237330 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz5vf\" (UniqueName: \"kubernetes.io/projected/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49-kube-api-access-bz5vf\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.278739 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81712631-a6c3-4b56-ad8c-dd51bc0d217b" (UID: "81712631-a6c3-4b56-ad8c-dd51bc0d217b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.339886 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81712631-a6c3-4b56-ad8c-dd51bc0d217b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.870298 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" event={"ID":"7cae252d-6eec-4e1e-a829-9b11b21c4d75","Type":"ContainerStarted","Data":"f70f4a1670eacedfb257d55397c8a5c6b2fc3a96fcf6e919e3dc16572dfd6018"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.870907 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" event={"ID":"7cae252d-6eec-4e1e-a829-9b11b21c4d75","Type":"ContainerStarted","Data":"c4d94df17866bfccb675661f65d656dd7fd4e181c702e05f27e80fd979337c05"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.870935 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.872634 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv" event={"ID":"d4c76f4b-80c1-409a-acba-39a9edf0c975","Type":"ContainerDied","Data":"98dd418d592b1751afbd4350e6aa635aa46b4b5dcdf1201100e105b7a5406ccf"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.872681 4806 scope.go:117] "RemoveContainer" containerID="b162d3bd96a4774cbb7c48533c1b2121c9a67d3d850962b55ffd80f8352d3062"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.873003 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nnnv"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.876147 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bltqq" event={"ID":"93c131a2-4035-4267-9ed4-a4aef44c7ca5","Type":"ContainerDied","Data":"7ca10d80a1d69086e006372cf95093e8e75fc2e85b41913075ba1b08ff41b737"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.876360 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bltqq"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.879330 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xh468" event={"ID":"141fd58d-8ec4-45ea-af22-89b1c8a0444d","Type":"ContainerDied","Data":"2bef659c7e3d0bc29863eba136718b6a3097d5dd68f5ad8ef9939c84d0b8a94b"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.879588 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xh468"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.882161 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2t6" event={"ID":"ecfd1573-8314-4dcd-9f0c-9c1f1b293c49","Type":"ContainerDied","Data":"f77def0fd30b6b5236034357cfef25166a37df61b56eab2f6ae82a7a232b20ec"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.882292 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2t6"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.889307 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.899188 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7498n" event={"ID":"81712631-a6c3-4b56-ad8c-dd51bc0d217b","Type":"ContainerDied","Data":"5c8ff183b6c8012de20852db3b3250d3636c50524846ac7e731f654cc57375a4"}
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.899381 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7498n"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.901228 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gbz4l" podStartSLOduration=1.901206362 podStartE2EDuration="1.901206362s" podCreationTimestamp="2026-02-17 15:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:24:30.893976046 +0000 UTC m=+232.424606477" watchObservedRunningTime="2026-02-17 15:24:30.901206362 +0000 UTC m=+232.431836773"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.903802 4806 scope.go:117] "RemoveContainer" containerID="5e95a73cb1296494acce3298ca1c907495708b2653095089e700c2e631711ee0"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.953970 4806 scope.go:117] "RemoveContainer" containerID="0f9f3aa3c58fd6672a6c55a0bc1c67036d03fbbf0a31dbd427d7cb37d4576fa4"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.967882 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2t6"]
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.989955 4806 scope.go:117] "RemoveContainer" containerID="0933e94b7c2edca3d4e6af5cb2d433012ef3a2f67ff840a6c7376c71cde3ea88"
Feb 17 15:24:30 crc kubenswrapper[4806]: I0217 15:24:30.993435 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2t6"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.002007 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7498n"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.007795 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7498n"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.015909 4806 scope.go:117] "RemoveContainer" containerID="ea668d50c411e7747cf3613452948a4c600948cc8df0a185c52f26eab8ac129b"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.016075 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bltqq"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.017693 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bltqq"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.020135 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.023115 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nnnv"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.024640 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xh468"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.026753 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xh468"]
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.033280 4806 scope.go:117] "RemoveContainer" containerID="d7155ebb706bf7ec255b59c69e912fc63667e5e833c199b65b04589b73f3efd5"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.048456 4806 scope.go:117] "RemoveContainer" containerID="6d622350b309eed3474449f56cca0f3d5eb7abf832691b7e279aee964f80f36a"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.124488 4806 scope.go:117] "RemoveContainer" containerID="615b776fb0db2468a2b330b6922f07ec8a2beaa0be871285ae3d7bede2e7d17e"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.138775 4806 scope.go:117] "RemoveContainer" containerID="6f5add4e9acdf6facf060af3f84cdb9b6eec83f5d64e3ab56be0b5f9431b6af6"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.151712 4806 scope.go:117] "RemoveContainer" containerID="209e0c9a2d582d241b879d82849fd6165a9352adc56f653a8ff85c53476501a4"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.164140 4806 scope.go:117] "RemoveContainer" containerID="24deb8ff2db3325f36a4451ede2f791743682a27eb66a744fdc7467de6c8ce5a"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.167175 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" path="/var/lib/kubelet/pods/141fd58d-8ec4-45ea-af22-89b1c8a0444d/volumes"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.167993 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" path="/var/lib/kubelet/pods/81712631-a6c3-4b56-ad8c-dd51bc0d217b/volumes"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.168560 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" path="/var/lib/kubelet/pods/93c131a2-4035-4267-9ed4-a4aef44c7ca5/volumes"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.169147 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" path="/var/lib/kubelet/pods/d4c76f4b-80c1-409a-acba-39a9edf0c975/volumes"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.169983 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" path="/var/lib/kubelet/pods/ecfd1573-8314-4dcd-9f0c-9c1f1b293c49/volumes"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.181762 4806 scope.go:117] "RemoveContainer" containerID="503555c017cda12438fff52458b83b67a835cab3b78623fc03ebf231e9f9aae0"
Feb 17 15:24:31 crc kubenswrapper[4806]: I0217 15:24:31.209895 4806 scope.go:117] "RemoveContainer" containerID="2354912bf2ed3ec46ea8c9d827576fef0cab676c5cd7801d1edccc5f8c29cfda"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.614198 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vfjtk"]
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.614972 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.614995 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615016 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615030 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615048 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615060 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615075 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615086 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615104 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615116 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615136 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615148 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615163 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615175 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615190 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615202 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615218 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615229 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615245 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615259 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615274 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615286 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615305 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615317 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="extract-content"
Feb 17 15:24:32 crc kubenswrapper[4806]: E0217 15:24:32.615332 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.615344 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="extract-utilities"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.616641 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="141fd58d-8ec4-45ea-af22-89b1c8a0444d" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.616690 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="81712631-a6c3-4b56-ad8c-dd51bc0d217b" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.616703 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4c76f4b-80c1-409a-acba-39a9edf0c975" containerName="marketplace-operator"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.616712 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfd1573-8314-4dcd-9f0c-9c1f1b293c49" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.616728 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c131a2-4035-4267-9ed4-a4aef44c7ca5" containerName="registry-server"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.617615 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.622300 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.633359 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfjtk"]
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.679044 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-utilities\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.679107 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvwsw\" (UniqueName: \"kubernetes.io/projected/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-kube-api-access-hvwsw\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.679147 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-catalog-content\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.780722 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-utilities\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.780808 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvwsw\" (UniqueName: \"kubernetes.io/projected/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-kube-api-access-hvwsw\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.780857 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-catalog-content\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.781316 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-utilities\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.781337 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-catalog-content\") pod \"redhat-marketplace-vfjtk\" (UID: \"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk"
Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.803808 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvwsw\" (UniqueName: \"kubernetes.io/projected/faa62cd1-70a5-4d9b-9d84-44e8dead8ea5-kube-api-access-hvwsw\") pod \"redhat-marketplace-vfjtk\" (UID: 
\"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5\") " pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.815261 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.816300 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.820563 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.827857 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.882340 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.882424 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk47\" (UniqueName: \"kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.882518 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " 
pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.955620 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.983742 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.983812 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.983840 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msk47\" (UniqueName: \"kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.984882 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:32 crc kubenswrapper[4806]: I0217 15:24:32.985176 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.006051 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk47\" (UniqueName: \"kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47\") pod \"redhat-operators-vpggz\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.151341 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.393569 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfjtk"] Feb 17 15:24:33 crc kubenswrapper[4806]: W0217 15:24:33.404811 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa62cd1_70a5_4d9b_9d84_44e8dead8ea5.slice/crio-205bc57966d3568ed793077d3c34b0be55c8b57ea2a671f35bc9559a73c3276e WatchSource:0}: Error finding container 205bc57966d3568ed793077d3c34b0be55c8b57ea2a671f35bc9559a73c3276e: Status 404 returned error can't find the container with id 205bc57966d3568ed793077d3c34b0be55c8b57ea2a671f35bc9559a73c3276e Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.550426 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:24:33 crc kubenswrapper[4806]: W0217 15:24:33.560209 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d056e41_b3c8_477d_9639_2134afcf7535.slice/crio-fdf56cb9309e6edb4d8ce7f889c2eecc3ab7cd14f46fc16fc28828d9bf32b6ce 
WatchSource:0}: Error finding container fdf56cb9309e6edb4d8ce7f889c2eecc3ab7cd14f46fc16fc28828d9bf32b6ce: Status 404 returned error can't find the container with id fdf56cb9309e6edb4d8ce7f889c2eecc3ab7cd14f46fc16fc28828d9bf32b6ce Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.927798 4806 generic.go:334] "Generic (PLEG): container finished" podID="0d056e41-b3c8-477d-9639-2134afcf7535" containerID="988ab8e7d27c971268902ae4f61248a9c9effd1da13574fcbfb95cb7f2beb990" exitCode=0 Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.927882 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerDied","Data":"988ab8e7d27c971268902ae4f61248a9c9effd1da13574fcbfb95cb7f2beb990"} Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.928020 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerStarted","Data":"fdf56cb9309e6edb4d8ce7f889c2eecc3ab7cd14f46fc16fc28828d9bf32b6ce"} Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.930087 4806 generic.go:334] "Generic (PLEG): container finished" podID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" containerID="358c4d2198356352365da4c41b7aa2b926cf8704bf097813e222d7c1f6fd60d8" exitCode=0 Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.930153 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfjtk" event={"ID":"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5","Type":"ContainerDied","Data":"358c4d2198356352365da4c41b7aa2b926cf8704bf097813e222d7c1f6fd60d8"} Feb 17 15:24:33 crc kubenswrapper[4806]: I0217 15:24:33.930192 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfjtk" 
event={"ID":"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5","Type":"ContainerStarted","Data":"205bc57966d3568ed793077d3c34b0be55c8b57ea2a671f35bc9559a73c3276e"} Feb 17 15:24:34 crc kubenswrapper[4806]: I0217 15:24:34.937694 4806 generic.go:334] "Generic (PLEG): container finished" podID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" containerID="9dfd8f6505c0cd44e396f51d8be1cafb7384d1c14e08665a5bae84d85fac19fa" exitCode=0 Feb 17 15:24:34 crc kubenswrapper[4806]: I0217 15:24:34.937740 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfjtk" event={"ID":"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5","Type":"ContainerDied","Data":"9dfd8f6505c0cd44e396f51d8be1cafb7384d1c14e08665a5bae84d85fac19fa"} Feb 17 15:24:34 crc kubenswrapper[4806]: I0217 15:24:34.943995 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerStarted","Data":"ac3e50098ae97e696ab0358be018560d988af169a740f9432ffa28494c130b44"} Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.019512 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m5972"] Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.021211 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.029707 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.033849 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5972"] Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.116593 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-catalog-content\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.116678 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-utilities\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.116817 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc49q\" (UniqueName: \"kubernetes.io/projected/00aa8f35-6b46-4bf7-9676-1b2721bc8981-kube-api-access-sc49q\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.217873 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-catalog-content\") pod \"community-operators-m5972\" (UID: 
\"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.217945 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-utilities\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.217979 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc49q\" (UniqueName: \"kubernetes.io/projected/00aa8f35-6b46-4bf7-9676-1b2721bc8981-kube-api-access-sc49q\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.219203 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-utilities\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.219701 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00aa8f35-6b46-4bf7-9676-1b2721bc8981-catalog-content\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.221205 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zk9n"] Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.222457 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.225520 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.242559 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc49q\" (UniqueName: \"kubernetes.io/projected/00aa8f35-6b46-4bf7-9676-1b2721bc8981-kube-api-access-sc49q\") pod \"community-operators-m5972\" (UID: \"00aa8f35-6b46-4bf7-9676-1b2721bc8981\") " pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.278095 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zk9n"] Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.319057 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44csl\" (UniqueName: \"kubernetes.io/projected/55d6f08f-1a64-42ba-8633-98e6b012ff7c-kube-api-access-44csl\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.319112 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-catalog-content\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.319188 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-utilities\") pod \"certified-operators-9zk9n\" (UID: 
\"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.338869 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.420851 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44csl\" (UniqueName: \"kubernetes.io/projected/55d6f08f-1a64-42ba-8633-98e6b012ff7c-kube-api-access-44csl\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.421118 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-catalog-content\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.421719 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-catalog-content\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.421856 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-utilities\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.422124 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55d6f08f-1a64-42ba-8633-98e6b012ff7c-utilities\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.445362 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44csl\" (UniqueName: \"kubernetes.io/projected/55d6f08f-1a64-42ba-8633-98e6b012ff7c-kube-api-access-44csl\") pod \"certified-operators-9zk9n\" (UID: \"55d6f08f-1a64-42ba-8633-98e6b012ff7c\") " pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.542715 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.762624 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zk9n"] Feb 17 15:24:35 crc kubenswrapper[4806]: W0217 15:24:35.773823 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d6f08f_1a64_42ba_8633_98e6b012ff7c.slice/crio-7ca2571dfcc95b9dbf6ebe674536d7453bb89dd7e4fc66f1a46782c56760f6bf WatchSource:0}: Error finding container 7ca2571dfcc95b9dbf6ebe674536d7453bb89dd7e4fc66f1a46782c56760f6bf: Status 404 returned error can't find the container with id 7ca2571dfcc95b9dbf6ebe674536d7453bb89dd7e4fc66f1a46782c56760f6bf Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.811067 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5972"] Feb 17 15:24:35 crc kubenswrapper[4806]: W0217 15:24:35.818027 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00aa8f35_6b46_4bf7_9676_1b2721bc8981.slice/crio-8524e29a2b7b7402e1b2a77eb52b1388d43145e7877e1cb8c83a7909b1f8547d WatchSource:0}: Error finding container 8524e29a2b7b7402e1b2a77eb52b1388d43145e7877e1cb8c83a7909b1f8547d: Status 404 returned error can't find the container with id 8524e29a2b7b7402e1b2a77eb52b1388d43145e7877e1cb8c83a7909b1f8547d Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.954972 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5972" event={"ID":"00aa8f35-6b46-4bf7-9676-1b2721bc8981","Type":"ContainerStarted","Data":"8524e29a2b7b7402e1b2a77eb52b1388d43145e7877e1cb8c83a7909b1f8547d"} Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.957105 4806 generic.go:334] "Generic (PLEG): container finished" podID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" containerID="f04874535924769eaff563029388236053b366cc83ac068d783bc0cd8e8b5b0e" exitCode=0 Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.957316 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk9n" event={"ID":"55d6f08f-1a64-42ba-8633-98e6b012ff7c","Type":"ContainerDied","Data":"f04874535924769eaff563029388236053b366cc83ac068d783bc0cd8e8b5b0e"} Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.957668 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk9n" event={"ID":"55d6f08f-1a64-42ba-8633-98e6b012ff7c","Type":"ContainerStarted","Data":"7ca2571dfcc95b9dbf6ebe674536d7453bb89dd7e4fc66f1a46782c56760f6bf"} Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.963617 4806 generic.go:334] "Generic (PLEG): container finished" podID="0d056e41-b3c8-477d-9639-2134afcf7535" containerID="ac3e50098ae97e696ab0358be018560d988af169a740f9432ffa28494c130b44" exitCode=0 Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.963682 4806 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerDied","Data":"ac3e50098ae97e696ab0358be018560d988af169a740f9432ffa28494c130b44"} Feb 17 15:24:35 crc kubenswrapper[4806]: I0217 15:24:35.972151 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfjtk" event={"ID":"faa62cd1-70a5-4d9b-9d84-44e8dead8ea5","Type":"ContainerStarted","Data":"8dcbd3c7e730129d13641bd9dbe95e5c2bfdb334c634ace359f28ec5a76ee7e4"} Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.027624 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vfjtk" podStartSLOduration=2.587661352 podStartE2EDuration="4.027608531s" podCreationTimestamp="2026-02-17 15:24:32 +0000 UTC" firstStartedPulling="2026-02-17 15:24:33.932604144 +0000 UTC m=+235.463234565" lastFinishedPulling="2026-02-17 15:24:35.372551293 +0000 UTC m=+236.903181744" observedRunningTime="2026-02-17 15:24:36.025365707 +0000 UTC m=+237.555996148" watchObservedRunningTime="2026-02-17 15:24:36.027608531 +0000 UTC m=+237.558238942" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.732186 4806 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.733192 4806 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.733325 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.733883 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f" gracePeriod=15 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.733895 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a" gracePeriod=15 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734009 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8" gracePeriod=15 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.733976 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b" gracePeriod=15 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734517 4806 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734012 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a" gracePeriod=15 Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734763 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734789 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734802 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734809 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734821 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734828 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734841 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734849 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734859 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc 
kubenswrapper[4806]: I0217 15:24:36.734866 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734877 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734884 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734891 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734896 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.734904 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.734910 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735022 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735033 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735042 4806 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735057 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735090 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735100 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.735312 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.778321 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 15:24:36 crc kubenswrapper[4806]: E0217 15:24:36.814493 4806 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-9zk9n.1895120d4f3914ed openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-9zk9n,UID:55d6f08f-1a64-42ba-8633-98e6b012ff7c,APIVersion:v1,ResourceVersion:29703,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/certified-operator-index:v4.18\" in 853ms (853ms including waiting). 
Image size: 1234637517 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,LastTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842769 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842849 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842881 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842919 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc 
kubenswrapper[4806]: I0217 15:24:36.842943 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842970 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.842985 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.843016 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943688 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943737 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943760 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943786 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943804 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943824 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc 
kubenswrapper[4806]: I0217 15:24:36.943841 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943858 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943869 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943911 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943874 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943953 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943978 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943980 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.943981 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.944014 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.982899 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" 
event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerStarted","Data":"87d608d9be99788b64ffb2987212bd05d3c35d5629cded57d7c27edcdac8c318"} Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.983989 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.984161 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.984312 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.987555 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.988969 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.989758 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f" exitCode=0 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.989795 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a" exitCode=0 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.989812 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a" exitCode=0 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.989826 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b" exitCode=2 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.989853 4806 scope.go:117] "RemoveContainer" containerID="c6eb1cf4e28d4162928df9d7b69a0cb385a58ccf2bfe15289fd1ac3e39f7513b" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.991745 4806 generic.go:334] "Generic (PLEG): container finished" podID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" containerID="8b416cf8a0cb0035bdf814354878022c5977252ad7769e84518749a930c6edf5" exitCode=0 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.991798 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5972" event={"ID":"00aa8f35-6b46-4bf7-9676-1b2721bc8981","Type":"ContainerDied","Data":"8b416cf8a0cb0035bdf814354878022c5977252ad7769e84518749a930c6edf5"} Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.992523 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 
15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.992849 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.993397 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.993827 4806 generic.go:334] "Generic (PLEG): container finished" podID="6cf0e72c-ef97-46e0-9e67-044d1f893320" containerID="c61c7c3a4ca50cc317926b697c7949cfe779e5a2d68ed853f1970fdd5009dc54" exitCode=0 Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.993918 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6cf0e72c-ef97-46e0-9e67-044d1f893320","Type":"ContainerDied","Data":"c61c7c3a4ca50cc317926b697c7949cfe779e5a2d68ed853f1970fdd5009dc54"} Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.994065 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.994767 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.995103 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.995584 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.996131 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:36 crc kubenswrapper[4806]: I0217 15:24:36.996529 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:37 crc kubenswrapper[4806]: I0217 15:24:37.069354 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:24:37 crc kubenswrapper[4806]: W0217 15:24:37.091361 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ecc751d180edabd43a05b317616ef580d419e1608f6081b23dbe714b420eb729 WatchSource:0}: Error finding container ecc751d180edabd43a05b317616ef580d419e1608f6081b23dbe714b420eb729: Status 404 returned error can't find the container with id ecc751d180edabd43a05b317616ef580d419e1608f6081b23dbe714b420eb729 Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.003020 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.005523 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5972" event={"ID":"00aa8f35-6b46-4bf7-9676-1b2721bc8981","Type":"ContainerStarted","Data":"94be03c95585a6c6c67700a35efefa5fb1a37ea564f8b4c6e59aecde88241d2a"} Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.006564 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.006718 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 
15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.006868 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.007209 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.008105 4806 generic.go:334] "Generic (PLEG): container finished" podID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" containerID="200c892dadc26f1f72d5bd2731578c3ae3f0d4a5fe9fda643be0dbe65056b414" exitCode=0 Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.008164 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk9n" event={"ID":"55d6f08f-1a64-42ba-8633-98e6b012ff7c","Type":"ContainerDied","Data":"200c892dadc26f1f72d5bd2731578c3ae3f0d4a5fe9fda643be0dbe65056b414"} Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.012535 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.012858 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.017629 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.018121 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.018397 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.018472 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"73c291f28f9e5d9e90a3c8c2571e1f5d971573ae8724eb803be9a74737105f21"} Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.018496 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ecc751d180edabd43a05b317616ef580d419e1608f6081b23dbe714b420eb729"} Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.019029 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.019228 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.019430 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.019579 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.019713 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.291287 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.292175 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.292583 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.292824 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.293030 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:38 crc 
kubenswrapper[4806]: I0217 15:24:38.293254 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.365902 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock\") pod \"6cf0e72c-ef97-46e0-9e67-044d1f893320\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") "
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.365971 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir\") pod \"6cf0e72c-ef97-46e0-9e67-044d1f893320\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") "
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.366086 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access\") pod \"6cf0e72c-ef97-46e0-9e67-044d1f893320\" (UID: \"6cf0e72c-ef97-46e0-9e67-044d1f893320\") "
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.366166 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock" (OuterVolumeSpecName: "var-lock") pod "6cf0e72c-ef97-46e0-9e67-044d1f893320" (UID: "6cf0e72c-ef97-46e0-9e67-044d1f893320"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.366212 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6cf0e72c-ef97-46e0-9e67-044d1f893320" (UID: "6cf0e72c-ef97-46e0-9e67-044d1f893320"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.366347 4806 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.366360 4806 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cf0e72c-ef97-46e0-9e67-044d1f893320-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.372314 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6cf0e72c-ef97-46e0-9e67-044d1f893320" (UID: "6cf0e72c-ef97-46e0-9e67-044d1f893320"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:24:38 crc kubenswrapper[4806]: I0217 15:24:38.468198 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cf0e72c-ef97-46e0-9e67-044d1f893320-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.029850 4806 generic.go:334] "Generic (PLEG): container finished" podID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" containerID="94be03c95585a6c6c67700a35efefa5fb1a37ea564f8b4c6e59aecde88241d2a" exitCode=0
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.030162 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5972" event={"ID":"00aa8f35-6b46-4bf7-9676-1b2721bc8981","Type":"ContainerDied","Data":"94be03c95585a6c6c67700a35efefa5fb1a37ea564f8b4c6e59aecde88241d2a"}
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.031613 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.031981 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.032380 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.032645 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.032893 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.033813 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zk9n" event={"ID":"55d6f08f-1a64-42ba-8633-98e6b012ff7c","Type":"ContainerStarted","Data":"a167ee41642a19a504ff25c0a04e423a6be790d61ce93fcfacc2a45d52d121e8"}
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.034627 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.035061 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.035453 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.035719 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.035968 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.040613 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6cf0e72c-ef97-46e0-9e67-044d1f893320","Type":"ContainerDied","Data":"a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb"}
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.040651 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b06a9dad125d1fbcf399627eee4d0eca288b58aa896b0c4e002cceed885beb"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.040787 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.095700 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.096506 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.096800 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.097042 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.097278 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.099971 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.101503 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.101880 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.102164 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.102320 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.102484 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.102638 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.102805 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.163621 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.164490 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.165177 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.165357 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.166262 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.166461 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177100 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177184 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177202 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177885 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177916 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.177931 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.278699 4806 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.278725 4806 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:39 crc kubenswrapper[4806]: I0217 15:24:39.278733 4806 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.051490 4806 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8" exitCode=0
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.052342 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.054086 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.054556 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.054834 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.054978 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.055115 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.055253 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.224153 4806 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.224478 4806 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.224767 4806 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.225307 4806 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.225624 4806 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.225653 4806 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.225852 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="200ms"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.274800 4806 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.115s"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.274964 4806 scope.go:117] "RemoveContainer" containerID="7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.301369 4806 scope.go:117] "RemoveContainer" containerID="cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.302451 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.302789 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.303044 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.303282 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.303527 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.303829 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.325641 4806 scope.go:117] "RemoveContainer" containerID="73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.337343 4806 scope.go:117] "RemoveContainer" containerID="3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.354378 4806 scope.go:117] "RemoveContainer" containerID="04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.380276 4806 scope.go:117] "RemoveContainer" containerID="dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.415047 4806 scope.go:117] "RemoveContainer" containerID="7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.416269 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\": container with ID starting with 7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f not found: ID does not exist" containerID="7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.416309 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f"} err="failed to get container status \"7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\": rpc error: code = NotFound desc = could not find container \"7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f\": container with ID starting with 7ce06f4ad6c950862afa6e4925aea82913ac852d7b048191459d3cd618bd4b4f not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.416337 4806 scope.go:117] "RemoveContainer" containerID="cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.416774 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\": container with ID starting with cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a not found: ID does not exist" containerID="cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.416799 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a"} err="failed to get container status \"cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\": rpc error: code = NotFound desc = could not find container \"cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a\": container with ID starting with cf3cb51f870c81ab8b947caf42c6564dbb37e248e93856d67b56335451d91a4a not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.416824 4806 scope.go:117] "RemoveContainer" containerID="73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.417176 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\": container with ID starting with 73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a not found: ID does not exist" containerID="73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.417215 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a"} err="failed to get container status \"73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\": rpc error: code = NotFound desc = could not find container \"73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a\": container with ID starting with 73f80cb64abec9b311c4643f24bdfa91fa743c6910b7cc46d40e0dc1e973bb5a not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.417250 4806 scope.go:117] "RemoveContainer" containerID="3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.417744 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\": container with ID starting with 3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b not found: ID does not exist" containerID="3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.417775 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b"} err="failed to get container status \"3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\": rpc error: code = NotFound desc = could not find container \"3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b\": container with ID starting with 3ea71be14f4335b74729114f12f1eb2e15d36d2a158d44956a2bcc941834118b not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.417798 4806 scope.go:117] "RemoveContainer" containerID="04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.418190 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\": container with ID starting with 04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8 not found: ID does not exist" containerID="04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.418232 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8"} err="failed to get container status \"04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\": rpc error: code = NotFound desc = could not find container \"04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8\": container with ID starting with 04152ab0782105f938c56098fa44b6fcee9c19db970d784856bc5ae0765d85c8 not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.418261 4806 scope.go:117] "RemoveContainer" containerID="dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.418662 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\": container with ID starting with dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90 not found: ID does not exist" containerID="dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90"
Feb 17 15:24:40 crc kubenswrapper[4806]: I0217 15:24:40.418693 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90"} err="failed to get container status \"dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\": rpc error: code = NotFound desc = could not find container \"dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90\": container with ID starting with dff223eee3a5935b775ed2109fbda24092fb16fffd4b06d05b1193e6b1f76a90 not found: ID does not exist"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.426361 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="400ms"
Feb 17 15:24:40 crc kubenswrapper[4806]: E0217 15:24:40.827807 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="800ms"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.069258 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5972" event={"ID":"00aa8f35-6b46-4bf7-9676-1b2721bc8981","Type":"ContainerStarted","Data":"e627ebdb8edb7dde1ae7deda59096ccfc2fb123469db3418ba97ea0cf6d5f358"}
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.070945 4806 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.071323 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.072087 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.072628 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.072932 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.073239 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused"
Feb 17 15:24:41 crc kubenswrapper[4806]: I0217 15:24:41.168899 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 17 15:24:41 crc kubenswrapper[4806]: E0217 15:24:41.508916 4806 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-9zk9n.1895120d4f3914ed openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-9zk9n,UID:55d6f08f-1a64-42ba-8633-98e6b012ff7c,APIVersion:v1,ResourceVersion:29703,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/certified-operator-index:v4.18\" in 853ms (853ms including waiting). Image size: 1234637517 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,LastTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 15:24:41 crc kubenswrapper[4806]: E0217 15:24:41.629649 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="1.6s" Feb 17 15:24:42 crc kubenswrapper[4806]: I0217 15:24:42.956518 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:42 crc kubenswrapper[4806]: I0217 15:24:42.957031 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.007801 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.008314 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 
38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.008585 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.008931 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.009107 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.009249 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.009444 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.143012 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vfjtk" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.143600 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.143843 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.144134 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.144519 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.144806 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" 
pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.145029 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.152160 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.152195 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.189906 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.190345 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.190796 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 
15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.191376 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.191722 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.191997 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: I0217 15:24:43.192310 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:43 crc kubenswrapper[4806]: E0217 15:24:43.230657 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="3.2s" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.136209 4806 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.136996 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.137487 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.137774 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.138197 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.138770 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:44 crc kubenswrapper[4806]: I0217 15:24:44.139091 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.339711 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.340227 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.395479 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.396184 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.396840 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 
15:24:45.397325 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.397742 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.398079 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.398505 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.543950 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.544317 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.597476 4806 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.598041 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.598443 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.598881 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.599176 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.599651 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:45 crc kubenswrapper[4806]: I0217 15:24:45.600030 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.147193 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m5972" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.147595 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.147806 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.147963 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc 
kubenswrapper[4806]: I0217 15:24:46.148113 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.148272 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.148455 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.149137 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zk9n" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.149340 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.149650 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.150052 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.150237 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.150443 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: I0217 15:24:46.150630 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:46 crc kubenswrapper[4806]: E0217 15:24:46.431554 4806 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.12:6443: connect: connection refused" interval="6.4s" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.165476 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.166481 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.166876 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.167255 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.167623 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:49 crc kubenswrapper[4806]: I0217 15:24:49.167963 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:50 crc kubenswrapper[4806]: I0217 15:24:50.922613 4806 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 17 15:24:50 crc kubenswrapper[4806]: I0217 15:24:50.922675 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.128182 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.128239 4806 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345" exitCode=1 Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.128269 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345"} Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.128747 4806 scope.go:117] "RemoveContainer" containerID="6ea6c678b3e568f202b3732caee05df7a0f2636f5f044f6dae88a86c5a149345" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.129272 4806 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.129691 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.130090 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.130527 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" 
Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.130841 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.131151 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.131468 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.160208 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.161973 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.162655 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.162961 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.163249 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.163594 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.163817 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.164087 4806 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.188529 4806 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.188569 4806 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:51 crc kubenswrapper[4806]: E0217 15:24:51.189334 4806 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.190175 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:51 crc kubenswrapper[4806]: E0217 15:24:51.510478 4806 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.12:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-9zk9n.1895120d4f3914ed openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-9zk9n,UID:55d6f08f-1a64-42ba-8633-98e6b012ff7c,APIVersion:v1,ResourceVersion:29703,FieldPath:spec.initContainers{extract-content},},Reason:Pulled,Message:Successfully pulled image \"registry.redhat.io/redhat/certified-operator-index:v4.18\" in 853ms (853ms including waiting). Image size: 1234637517 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,LastTimestamp:2026-02-17 15:24:36.813173997 +0000 UTC m=+238.343804408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 15:24:51 crc kubenswrapper[4806]: I0217 15:24:51.833866 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.139803 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.140493 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e04563dbb843198eeaf0c47941135250ec55a4eb72f209f22ae654973cfffec"} Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.141775 4806 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.142476 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.142819 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.143261 4806 status_manager.go:851] "Failed to get status for pod" podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.143632 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.143885 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144086 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144639 4806 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a4e0c6bcc5977b2ad8018a0373dc33c8d1fe15d2f24af8204c8ad1b339a572ea" exitCode=0 Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144674 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a4e0c6bcc5977b2ad8018a0373dc33c8d1fe15d2f24af8204c8ad1b339a572ea"} Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144694 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"32e421541a4fd90492beece45de452125f6000ea3f613b81cf96000c51a34610"} Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144968 4806 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.144987 4806 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:52 crc kubenswrapper[4806]: E0217 15:24:52.145187 4806 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.145538 4806 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.145968 4806 status_manager.go:851] "Failed to get status for pod" podUID="00aa8f35-6b46-4bf7-9676-1b2721bc8981" pod="openshift-marketplace/community-operators-m5972" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m5972\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.146352 4806 status_manager.go:851] "Failed to get status for pod" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" pod="openshift-marketplace/redhat-operators-vpggz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vpggz\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.146784 4806 status_manager.go:851] "Failed to get status for pod" 
podUID="55d6f08f-1a64-42ba-8633-98e6b012ff7c" pod="openshift-marketplace/certified-operators-9zk9n" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zk9n\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.147012 4806 status_manager.go:851] "Failed to get status for pod" podUID="faa62cd1-70a5-4d9b-9d84-44e8dead8ea5" pod="openshift-marketplace/redhat-marketplace-vfjtk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vfjtk\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.147223 4806 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:52 crc kubenswrapper[4806]: I0217 15:24:52.147453 4806 status_manager.go:851] "Failed to get status for pod" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.12:6443: connect: connection refused" Feb 17 15:24:53 crc kubenswrapper[4806]: I0217 15:24:53.154697 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"41696848b724eae66bacde10c65a95c8588a6059a776a2eb2696fed58fdc9dd5"} Feb 17 15:24:53 crc kubenswrapper[4806]: I0217 15:24:53.155146 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80e3c383b4f1b53e3c6674bc95ebabda56f56e6aa91c5fbe734dfa5633008acb"} Feb 17 15:24:53 crc kubenswrapper[4806]: I0217 15:24:53.155163 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"23bd158f13bafa5a8429f5129e35e7cde6eb6e666129a8c5d4b7a32aa8d0191d"} Feb 17 15:24:54 crc kubenswrapper[4806]: I0217 15:24:54.171761 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d9f2c238f5e17ec6924b87cb489fb3344ce395ebeb0b0d88a350b6cdb706e104"} Feb 17 15:24:54 crc kubenswrapper[4806]: I0217 15:24:54.171818 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc95c56f103d50a8dbe39f7dddac3dcf38bed90f4df5c673739b94f701a76a32"} Feb 17 15:24:54 crc kubenswrapper[4806]: I0217 15:24:54.172046 4806 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:54 crc kubenswrapper[4806]: I0217 15:24:54.172061 4806 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:24:54 crc kubenswrapper[4806]: I0217 15:24:54.172286 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:56 crc kubenswrapper[4806]: I0217 15:24:56.190809 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:56 crc kubenswrapper[4806]: I0217 15:24:56.191252 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:56 crc kubenswrapper[4806]: I0217 15:24:56.207216 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:58 crc kubenswrapper[4806]: I0217 15:24:58.279364 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:24:58 crc kubenswrapper[4806]: I0217 15:24:58.288227 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:24:59 crc kubenswrapper[4806]: I0217 15:24:59.188621 4806 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 15:24:59 crc kubenswrapper[4806]: I0217 15:24:59.201231 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:24:59 crc kubenswrapper[4806]: I0217 15:24:59.310872 4806 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="280b4ab7-8afa-4c71-9a64-e2ba12bcb21c" Feb 17 15:25:00 crc kubenswrapper[4806]: I0217 15:25:00.206113 4806 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:25:00 crc kubenswrapper[4806]: I0217 15:25:00.206485 4806 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cfc1d3b4-b4b9-43e6-bed6-9377a2dd9f16" Feb 17 15:25:00 crc kubenswrapper[4806]: I0217 15:25:00.210460 4806 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="280b4ab7-8afa-4c71-9a64-e2ba12bcb21c" Feb 17 15:25:01 crc kubenswrapper[4806]: I0217 15:25:01.838371 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 15:25:08 crc kubenswrapper[4806]: I0217 15:25:08.171650 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 15:25:09 crc kubenswrapper[4806]: I0217 15:25:09.274838 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 15:25:09 crc kubenswrapper[4806]: I0217 15:25:09.632013 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 15:25:09 crc kubenswrapper[4806]: I0217 15:25:09.775028 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 15:25:09 crc kubenswrapper[4806]: I0217 15:25:09.995106 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.077529 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.111356 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.223562 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.241258 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 15:25:10 
crc kubenswrapper[4806]: I0217 15:25:10.253910 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.268357 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.412258 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.676834 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.688745 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.736073 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 15:25:10 crc kubenswrapper[4806]: I0217 15:25:10.992592 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.141439 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.671782 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.702490 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.772379 4806 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.871496 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 15:25:11 crc kubenswrapper[4806]: I0217 15:25:11.997903 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.024799 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.113531 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.130582 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.162643 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.192283 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.301898 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.317107 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.382836 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.442344 4806 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.447835 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.605328 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.655336 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.673559 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.687246 4806 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.688133 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpggz" podStartSLOduration=38.281635668 podStartE2EDuration="40.688110364s" podCreationTimestamp="2026-02-17 15:24:32 +0000 UTC" firstStartedPulling="2026-02-17 15:24:33.929740425 +0000 UTC m=+235.460370836" lastFinishedPulling="2026-02-17 15:24:36.336215111 +0000 UTC m=+237.866845532" observedRunningTime="2026-02-17 15:24:59.21314461 +0000 UTC m=+260.743775031" watchObservedRunningTime="2026-02-17 15:25:12.688110364 +0000 UTC m=+274.218740785" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.689378 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m5972" podStartSLOduration=36.106877785 podStartE2EDuration="38.689371034s" podCreationTimestamp="2026-02-17 15:24:34 +0000 UTC" firstStartedPulling="2026-02-17 15:24:36.99659933 
+0000 UTC m=+238.527229741" lastFinishedPulling="2026-02-17 15:24:39.579092579 +0000 UTC m=+241.109722990" observedRunningTime="2026-02-17 15:24:59.27353095 +0000 UTC m=+260.804161421" watchObservedRunningTime="2026-02-17 15:25:12.689371034 +0000 UTC m=+274.220001455" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.689733 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.689728582 podStartE2EDuration="36.689728582s" podCreationTimestamp="2026-02-17 15:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:24:59.246925282 +0000 UTC m=+260.777555683" watchObservedRunningTime="2026-02-17 15:25:12.689728582 +0000 UTC m=+274.220359003" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.689807 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zk9n" podStartSLOduration=34.803493661 podStartE2EDuration="37.689801914s" podCreationTimestamp="2026-02-17 15:24:35 +0000 UTC" firstStartedPulling="2026-02-17 15:24:35.959801351 +0000 UTC m=+237.490431762" lastFinishedPulling="2026-02-17 15:24:38.846109604 +0000 UTC m=+240.376740015" observedRunningTime="2026-02-17 15:24:59.226471664 +0000 UTC m=+260.757102085" watchObservedRunningTime="2026-02-17 15:25:12.689801914 +0000 UTC m=+274.220432345" Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.693385 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.693472 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.698388 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.721512 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.721495018 podStartE2EDuration="13.721495018s" podCreationTimestamp="2026-02-17 15:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:25:12.711319776 +0000 UTC m=+274.241950197" watchObservedRunningTime="2026-02-17 15:25:12.721495018 +0000 UTC m=+274.252125429"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.722977 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.779486 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.781233 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.785683 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.799653 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.835739 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.852092 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.868724 4806 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.870908 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.932263 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 15:25:12 crc kubenswrapper[4806]: I0217 15:25:12.955012 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.113794 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.159739 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.175454 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.191864 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.217195 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.300000 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.300870 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.324092 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.332045 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.359533 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.361924 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.656842 4806 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.767934 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.768914 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.782166 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.816526 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.858376 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.910266 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.936994 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 15:25:13 crc kubenswrapper[4806]: I0217 15:25:13.987575 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.055297 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.135329 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.295209 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.298906 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.396342 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.432042 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.446511 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.637535 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.707027 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.757025 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.792868 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.817999 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 15:25:14 crc kubenswrapper[4806]: I0217 15:25:14.962366 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.010552 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.087714 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.088812 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.100443 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.112333 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.176273 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.179339 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.432583 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.553109 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.555489 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.579423 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.607443 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.610236 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.617791 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.656305 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.678602 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.859108 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.900233 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.919623 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.995856 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 15:25:15 crc kubenswrapper[4806]: I0217 15:25:15.999331 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.000740 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.056326 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.114149 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.197103 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.441673 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.446073 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.559958 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.563569 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.564126 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.616372 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.649559 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.724221 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.744535 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.766109 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.880708 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.916954 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 15:25:16 crc kubenswrapper[4806]: I0217 15:25:16.939776 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.050051 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.187372 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.220035 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.232394 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.285024 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.322683 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.471650 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.486152 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.539068 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.552521 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.623367 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.638043 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.656886 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.692708 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.869582 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.929189 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 15:25:17 crc kubenswrapper[4806]: I0217 15:25:17.985440 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.020395 4806 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.063968 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.086808 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.121971 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.123823 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.299539 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.323418 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.488108 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.565396 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.582158 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.650117 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.715158 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.765294 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.796733 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.828198 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.859898 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.926366 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.985680 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 15:25:18 crc kubenswrapper[4806]: I0217 15:25:18.996156 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.145957 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.148287 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.174539 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.226457 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.238620 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.254670 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.270253 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.294149 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.331385 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.437750 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.453616 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.485814 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.487749 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.599843 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.649630 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.677181 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.677766 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.738217 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.803178 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.898191 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.924168 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.935503 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.944760 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.981893 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 15:25:19 crc kubenswrapper[4806]: I0217 15:25:19.981949 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.111620 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.137699 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.144015 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.185293 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.269043 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.278104 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.281175 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.357282 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.365814 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.366696 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.375938 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.387353 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.449444 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.478043 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.527912 4806 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.650022 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.699173 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.717337 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.717976 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.802099 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.828859 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.834049 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.975533 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 15:25:20 crc kubenswrapper[4806]: I0217 15:25:20.985760 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.005713 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.010488 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.027538 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.037304 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.097935 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.269271 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.280163 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.286867 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.325671 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.423325 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.443460 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.450006 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.459044 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.583493 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.618837 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.715986 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.771214 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.783021 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.810682 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.823841 4806 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.824114 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://73c291f28f9e5d9e90a3c8c2571e1f5d971573ae8724eb803be9a74737105f21" gracePeriod=5
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.860013 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 15:25:21 crc kubenswrapper[4806]: I0217 15:25:21.984482 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.015841 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.240955 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.305673 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.325869 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.366239 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.436833 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.452091 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.531464 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.650144 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.751228 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.930539 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 15:25:22 crc kubenswrapper[4806]: I0217 15:25:22.955865 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 15:25:23 crc kubenswrapper[4806]: I0217 15:25:23.107756 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 17 15:25:23 crc kubenswrapper[4806]: I0217 15:25:23.643204 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 15:25:23 crc kubenswrapper[4806]: I0217 15:25:23.691353 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 15:25:23 crc kubenswrapper[4806]: I0217 15:25:23.889305 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 15:25:23 crc kubenswrapper[4806]: I0217 15:25:23.999673 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.037445 4806 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.074810 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.198878 4806 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.249389 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.349252 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.481226 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.865352 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 15:25:24 crc kubenswrapper[4806]: I0217 15:25:24.943565 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 15:25:25 crc kubenswrapper[4806]: I0217 15:25:25.150105 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 15:25:25 crc kubenswrapper[4806]: I0217 15:25:25.351564 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 15:25:25 crc kubenswrapper[4806]: I0217 15:25:25.371119 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 
15:25:27.393238 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.393738 4806 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="73c291f28f9e5d9e90a3c8c2571e1f5d971573ae8724eb803be9a74737105f21" exitCode=137 Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.393798 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc751d180edabd43a05b317616ef580d419e1608f6081b23dbe714b420eb729" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.424974 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.425066 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.581307 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.581897 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582118 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582348 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582652 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.581464 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582019 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582499 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.582691 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.583524 4806 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.583741 4806 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.583959 4806 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.584162 4806 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.596853 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:25:27 crc kubenswrapper[4806]: I0217 15:25:27.686354 4806 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:28 crc kubenswrapper[4806]: I0217 15:25:28.399313 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.173735 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.174858 4806 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.191493 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.191605 4806 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="60a66511-4008-4c71-bd5b-9b38ca54ae38" Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.195855 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 15:25:29 crc kubenswrapper[4806]: I0217 15:25:29.196016 4806 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="60a66511-4008-4c71-bd5b-9b38ca54ae38" Feb 17 15:25:38 crc kubenswrapper[4806]: I0217 15:25:38.972676 4806 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.138216 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"] Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.139032 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" 
podUID="c9d54745-0a0c-436a-8ead-26184660d59c" containerName="controller-manager" containerID="cri-o://3a16e68c2cdfe7061b0f82fde54eaba8f4f59db28cb1cff4f3ad065df0f65a11" gracePeriod=30 Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.154044 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"] Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.154667 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerName="route-controller-manager" containerID="cri-o://7872f25bf5a389c2766d24e2c52cb0177d4107f9ebcd13052b41da528708a628" gracePeriod=30 Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.479661 4806 generic.go:334] "Generic (PLEG): container finished" podID="c9d54745-0a0c-436a-8ead-26184660d59c" containerID="3a16e68c2cdfe7061b0f82fde54eaba8f4f59db28cb1cff4f3ad065df0f65a11" exitCode=0 Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.480172 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" event={"ID":"c9d54745-0a0c-436a-8ead-26184660d59c","Type":"ContainerDied","Data":"3a16e68c2cdfe7061b0f82fde54eaba8f4f59db28cb1cff4f3ad065df0f65a11"} Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.481594 4806 generic.go:334] "Generic (PLEG): container finished" podID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerID="7872f25bf5a389c2766d24e2c52cb0177d4107f9ebcd13052b41da528708a628" exitCode=0 Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.481625 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" event={"ID":"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7","Type":"ContainerDied","Data":"7872f25bf5a389c2766d24e2c52cb0177d4107f9ebcd13052b41da528708a628"} Feb 
17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.543823 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.547061 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.711743 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles\") pod \"c9d54745-0a0c-436a-8ead-26184660d59c\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712233 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca\") pod \"c9d54745-0a0c-436a-8ead-26184660d59c\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712386 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdnmh\" (UniqueName: \"kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh\") pod \"c9d54745-0a0c-436a-8ead-26184660d59c\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712541 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config\") pod \"c9d54745-0a0c-436a-8ead-26184660d59c\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712654 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert\") pod \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712745 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c9d54745-0a0c-436a-8ead-26184660d59c" (UID: "c9d54745-0a0c-436a-8ead-26184660d59c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712784 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fp4h\" (UniqueName: \"kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h\") pod \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.712971 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert\") pod \"c9d54745-0a0c-436a-8ead-26184660d59c\" (UID: \"c9d54745-0a0c-436a-8ead-26184660d59c\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713081 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config\") pod \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\" (UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713236 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca\") pod \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\" 
(UID: \"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7\") " Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713327 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config" (OuterVolumeSpecName: "config") pod "c9d54745-0a0c-436a-8ead-26184660d59c" (UID: "c9d54745-0a0c-436a-8ead-26184660d59c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713717 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713813 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.713886 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9d54745-0a0c-436a-8ead-26184660d59c" (UID: "c9d54745-0a0c-436a-8ead-26184660d59c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.714165 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" (UID: "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.714254 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config" (OuterVolumeSpecName: "config") pod "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" (UID: "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.719255 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9d54745-0a0c-436a-8ead-26184660d59c" (UID: "c9d54745-0a0c-436a-8ead-26184660d59c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.719279 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h" (OuterVolumeSpecName: "kube-api-access-6fp4h") pod "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" (UID: "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7"). InnerVolumeSpecName "kube-api-access-6fp4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.719333 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" (UID: "7bc4dfb4-7e4d-42d1-957d-f50b38556bf7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.719820 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh" (OuterVolumeSpecName: "kube-api-access-zdnmh") pod "c9d54745-0a0c-436a-8ead-26184660d59c" (UID: "c9d54745-0a0c-436a-8ead-26184660d59c"). InnerVolumeSpecName "kube-api-access-zdnmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.814870 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d54745-0a0c-436a-8ead-26184660d59c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.814936 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.814953 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.814968 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9d54745-0a0c-436a-8ead-26184660d59c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.814984 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdnmh\" (UniqueName: \"kubernetes.io/projected/c9d54745-0a0c-436a-8ead-26184660d59c-kube-api-access-zdnmh\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.815001 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:41 crc kubenswrapper[4806]: I0217 15:25:41.815019 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fp4h\" (UniqueName: \"kubernetes.io/projected/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7-kube-api-access-6fp4h\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032373 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:42 crc kubenswrapper[4806]: E0217 15:25:42.032610 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d54745-0a0c-436a-8ead-26184660d59c" containerName="controller-manager" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032623 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d54745-0a0c-436a-8ead-26184660d59c" containerName="controller-manager" Feb 17 15:25:42 crc kubenswrapper[4806]: E0217 15:25:42.032637 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" containerName="installer" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032644 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" containerName="installer" Feb 17 15:25:42 crc kubenswrapper[4806]: E0217 15:25:42.032650 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032657 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 15:25:42 crc kubenswrapper[4806]: E0217 15:25:42.032664 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerName="route-controller-manager" Feb 17 15:25:42 crc 
kubenswrapper[4806]: I0217 15:25:42.032671 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerName="route-controller-manager" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032765 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" containerName="route-controller-manager" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032778 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d54745-0a0c-436a-8ead-26184660d59c" containerName="controller-manager" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032786 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.032792 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf0e72c-ef97-46e0-9e67-044d1f893320" containerName="installer" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.033161 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.041670 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.042307 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.047969 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.068832 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.118094 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.118153 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.118187 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.118221 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.118375 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9nk2\" (UniqueName: \"kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219283 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219350 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219372 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 
15:25:42.219392 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219430 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgnr\" (UniqueName: \"kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219461 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9nk2\" (UniqueName: \"kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219511 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219529 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.219548 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.220821 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.220943 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.221042 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.223468 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.235986 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9nk2\" (UniqueName: \"kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2\") pod \"controller-manager-7bc7bd46c7-4c2bb\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.283142 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.283602 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.321251 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.321327 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.321369 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.321395 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgnr\" (UniqueName: \"kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.323380 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.323639 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.329622 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.343707 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6"] Feb 17 15:25:42 crc kubenswrapper[4806]: E0217 15:25:42.344079 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-clgnr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" podUID="6da9dee7-b44e-4481-a0b3-45ef6d8330c3" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.359993 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgnr\" (UniqueName: \"kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr\") pod \"route-controller-manager-847d9569f-x5tx6\" (UID: 
\"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.491046 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" event={"ID":"c9d54745-0a0c-436a-8ead-26184660d59c","Type":"ContainerDied","Data":"6c56167c2bca3531e42aa4b24d3bd26e6a22f5cc5b64fdce4da10a79835df054"} Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.491394 4806 scope.go:117] "RemoveContainer" containerID="3a16e68c2cdfe7061b0f82fde54eaba8f4f59db28cb1cff4f3ad065df0f65a11" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.491539 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8lcn2" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.491761 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.493789 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.494196 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.494331 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6" event={"ID":"7bc4dfb4-7e4d-42d1-957d-f50b38556bf7","Type":"ContainerDied","Data":"e7de5cd88b420e0d022866ed651d11216231108d1f5256eca0e23eead152a732"} Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.518246 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.536244 4806 scope.go:117] "RemoveContainer" containerID="7872f25bf5a389c2766d24e2c52cb0177d4107f9ebcd13052b41da528708a628" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.540117 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.547482 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rbxx6"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.551177 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.555109 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8lcn2"] Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.625572 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgnr\" (UniqueName: \"kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr\") pod \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.625813 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert\") pod \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.625866 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config\") pod \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.625941 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca\") pod \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\" (UID: \"6da9dee7-b44e-4481-a0b3-45ef6d8330c3\") " Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.626647 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "6da9dee7-b44e-4481-a0b3-45ef6d8330c3" (UID: "6da9dee7-b44e-4481-a0b3-45ef6d8330c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.626994 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config" (OuterVolumeSpecName: "config") pod "6da9dee7-b44e-4481-a0b3-45ef6d8330c3" (UID: "6da9dee7-b44e-4481-a0b3-45ef6d8330c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.629284 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6da9dee7-b44e-4481-a0b3-45ef6d8330c3" (UID: "6da9dee7-b44e-4481-a0b3-45ef6d8330c3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.629586 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr" (OuterVolumeSpecName: "kube-api-access-clgnr") pod "6da9dee7-b44e-4481-a0b3-45ef6d8330c3" (UID: "6da9dee7-b44e-4481-a0b3-45ef6d8330c3"). InnerVolumeSpecName "kube-api-access-clgnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.727541 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.727596 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.727617 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:42 crc kubenswrapper[4806]: I0217 15:25:42.727636 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgnr\" (UniqueName: \"kubernetes.io/projected/6da9dee7-b44e-4481-a0b3-45ef6d8330c3-kube-api-access-clgnr\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.174054 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc4dfb4-7e4d-42d1-957d-f50b38556bf7" path="/var/lib/kubelet/pods/7bc4dfb4-7e4d-42d1-957d-f50b38556bf7/volumes" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.176239 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d54745-0a0c-436a-8ead-26184660d59c" 
path="/var/lib/kubelet/pods/c9d54745-0a0c-436a-8ead-26184660d59c/volumes" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.503634 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" event={"ID":"fe620c9d-e187-4cc3-960b-833fbbaaf879","Type":"ContainerStarted","Data":"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6"} Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.503725 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" podUID="fe620c9d-e187-4cc3-960b-833fbbaaf879" containerName="controller-manager" containerID="cri-o://9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6" gracePeriod=30 Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.503735 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" event={"ID":"fe620c9d-e187-4cc3-960b-833fbbaaf879","Type":"ContainerStarted","Data":"580fddb654a4140c261f336dee501cb90f3bdea423803a1d86fd3ba8ea399f8f"} Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.503924 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.508321 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.524243 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.529307 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" podStartSLOduration=1.5292822780000002 podStartE2EDuration="1.529282278s" podCreationTimestamp="2026-02-17 15:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:25:43.526865501 +0000 UTC m=+305.057495932" watchObservedRunningTime="2026-02-17 15:25:43.529282278 +0000 UTC m=+305.059912699" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.566227 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.567358 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.569190 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.571462 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.571590 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6"] Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.571862 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.571892 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.572152 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.572173 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.575318 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-x5tx6"] Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.587944 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.743167 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-cpb8r\" (UniqueName: \"kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.743210 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.743379 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.743554 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.844968 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: 
\"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.845016 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.845053 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpb8r\" (UniqueName: \"kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.845072 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.846753 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.847220 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.853539 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.865365 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpb8r\" (UniqueName: \"kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r\") pod \"route-controller-manager-644c66787f-2xtcm\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.911585 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:43 crc kubenswrapper[4806]: I0217 15:25:43.965859 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.046642 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config\") pod \"fe620c9d-e187-4cc3-960b-833fbbaaf879\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.046703 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca\") pod \"fe620c9d-e187-4cc3-960b-833fbbaaf879\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.046785 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert\") pod \"fe620c9d-e187-4cc3-960b-833fbbaaf879\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.046823 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9nk2\" (UniqueName: \"kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2\") pod \"fe620c9d-e187-4cc3-960b-833fbbaaf879\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.046885 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles\") pod \"fe620c9d-e187-4cc3-960b-833fbbaaf879\" (UID: \"fe620c9d-e187-4cc3-960b-833fbbaaf879\") " Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.047732 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca" (OuterVolumeSpecName: "client-ca") pod "fe620c9d-e187-4cc3-960b-833fbbaaf879" (UID: "fe620c9d-e187-4cc3-960b-833fbbaaf879"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.047866 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config" (OuterVolumeSpecName: "config") pod "fe620c9d-e187-4cc3-960b-833fbbaaf879" (UID: "fe620c9d-e187-4cc3-960b-833fbbaaf879"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.047941 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fe620c9d-e187-4cc3-960b-833fbbaaf879" (UID: "fe620c9d-e187-4cc3-960b-833fbbaaf879"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.051182 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fe620c9d-e187-4cc3-960b-833fbbaaf879" (UID: "fe620c9d-e187-4cc3-960b-833fbbaaf879"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.053024 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2" (OuterVolumeSpecName: "kube-api-access-b9nk2") pod "fe620c9d-e187-4cc3-960b-833fbbaaf879" (UID: "fe620c9d-e187-4cc3-960b-833fbbaaf879"). InnerVolumeSpecName "kube-api-access-b9nk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.109942 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.149208 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.149238 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.149248 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe620c9d-e187-4cc3-960b-833fbbaaf879-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.149258 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9nk2\" (UniqueName: \"kubernetes.io/projected/fe620c9d-e187-4cc3-960b-833fbbaaf879-kube-api-access-b9nk2\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.149272 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe620c9d-e187-4cc3-960b-833fbbaaf879-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.214320 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:44 crc kubenswrapper[4806]: W0217 15:25:44.230004 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc96154dd_da6f_45cf_8d96_f3608c0f124a.slice/crio-2102ec9d602fea8fe21bb049cb9bc5d3dede8e48b82c4b04bbb7e3646a4f0322 WatchSource:0}: Error finding container 2102ec9d602fea8fe21bb049cb9bc5d3dede8e48b82c4b04bbb7e3646a4f0322: Status 404 returned error can't find the container with id 2102ec9d602fea8fe21bb049cb9bc5d3dede8e48b82c4b04bbb7e3646a4f0322 Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.518552 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" event={"ID":"c96154dd-da6f-45cf-8d96-f3608c0f124a","Type":"ContainerStarted","Data":"e55a5b76bc5aaecacc226053816c164729bb093051b6f9c3cadf2064ddb850f2"} Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.519157 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" event={"ID":"c96154dd-da6f-45cf-8d96-f3608c0f124a","Type":"ContainerStarted","Data":"2102ec9d602fea8fe21bb049cb9bc5d3dede8e48b82c4b04bbb7e3646a4f0322"} Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.519205 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.518587 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" containerName="route-controller-manager" containerID="cri-o://e55a5b76bc5aaecacc226053816c164729bb093051b6f9c3cadf2064ddb850f2" gracePeriod=30 Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.520426 4806 generic.go:334] "Generic (PLEG): container finished" podID="fe620c9d-e187-4cc3-960b-833fbbaaf879" containerID="9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6" exitCode=0 
Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.520460 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" event={"ID":"fe620c9d-e187-4cc3-960b-833fbbaaf879","Type":"ContainerDied","Data":"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6"} Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.520488 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" event={"ID":"fe620c9d-e187-4cc3-960b-833fbbaaf879","Type":"ContainerDied","Data":"580fddb654a4140c261f336dee501cb90f3bdea423803a1d86fd3ba8ea399f8f"} Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.520508 4806 scope.go:117] "RemoveContainer" containerID="9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.520619 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.549073 4806 scope.go:117] "RemoveContainer" containerID="9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6" Feb 17 15:25:44 crc kubenswrapper[4806]: E0217 15:25:44.550904 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6\": container with ID starting with 9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6 not found: ID does not exist" containerID="9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.550994 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6"} err="failed to get container status 
\"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6\": rpc error: code = NotFound desc = could not find container \"9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6\": container with ID starting with 9ea52de74b5c1434aba3435ea8fd9279bb79068dc0db3d7f1d86953621340aa6 not found: ID does not exist" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.555618 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" podStartSLOduration=2.555593661 podStartE2EDuration="2.555593661s" podCreationTimestamp="2026-02-17 15:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:25:44.543080923 +0000 UTC m=+306.073711414" watchObservedRunningTime="2026-02-17 15:25:44.555593661 +0000 UTC m=+306.086224222" Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.570381 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.581104 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-4c2bb"] Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.909531 4806 patch_prober.go:28] interesting pod/route-controller-manager-644c66787f-2xtcm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": read tcp 10.217.0.2:45266->10.217.0.65:8443: read: connection reset by peer" start-of-body= Feb 17 15:25:44 crc kubenswrapper[4806]: I0217 15:25:44.909591 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": read tcp 10.217.0.2:45266->10.217.0.65:8443: read: connection reset by peer" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.172749 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da9dee7-b44e-4481-a0b3-45ef6d8330c3" path="/var/lib/kubelet/pods/6da9dee7-b44e-4481-a0b3-45ef6d8330c3/volumes" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.173568 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe620c9d-e187-4cc3-960b-833fbbaaf879" path="/var/lib/kubelet/pods/fe620c9d-e187-4cc3-960b-833fbbaaf879/volumes" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.530557 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-644c66787f-2xtcm_c96154dd-da6f-45cf-8d96-f3608c0f124a/route-controller-manager/0.log" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.530653 4806 generic.go:334] "Generic (PLEG): container finished" podID="c96154dd-da6f-45cf-8d96-f3608c0f124a" containerID="e55a5b76bc5aaecacc226053816c164729bb093051b6f9c3cadf2064ddb850f2" exitCode=255 Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.530794 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" event={"ID":"c96154dd-da6f-45cf-8d96-f3608c0f124a","Type":"ContainerDied","Data":"e55a5b76bc5aaecacc226053816c164729bb093051b6f9c3cadf2064ddb850f2"} Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.763069 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-644c66787f-2xtcm_c96154dd-da6f-45cf-8d96-f3608c0f124a/route-controller-manager/0.log" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.763536 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.812387 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l"] Feb 17 15:25:45 crc kubenswrapper[4806]: E0217 15:25:45.812941 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" containerName="route-controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.812968 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" containerName="route-controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: E0217 15:25:45.813009 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe620c9d-e187-4cc3-960b-833fbbaaf879" containerName="controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.813023 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe620c9d-e187-4cc3-960b-833fbbaaf879" containerName="controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.813206 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe620c9d-e187-4cc3-960b-833fbbaaf879" containerName="controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.813233 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" containerName="route-controller-manager" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.814021 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.816137 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l"] Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.841871 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.842784 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.845187 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.845705 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.845843 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.845971 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.845763 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.847848 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.862215 4806 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.870203 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.887085 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert\") pod \"c96154dd-da6f-45cf-8d96-f3608c0f124a\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.888376 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config\") pod \"c96154dd-da6f-45cf-8d96-f3608c0f124a\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.888692 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpb8r\" (UniqueName: \"kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r\") pod \"c96154dd-da6f-45cf-8d96-f3608c0f124a\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.889269 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca\") pod \"c96154dd-da6f-45cf-8d96-f3608c0f124a\" (UID: \"c96154dd-da6f-45cf-8d96-f3608c0f124a\") " Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.889521 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert\") pod \"controller-manager-5494bb8567-98kj4\" (UID: 
\"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.889633 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.889655 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca" (OuterVolumeSpecName: "client-ca") pod "c96154dd-da6f-45cf-8d96-f3608c0f124a" (UID: "c96154dd-da6f-45cf-8d96-f3608c0f124a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.889904 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr897\" (UniqueName: \"kubernetes.io/projected/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-kube-api-access-pr897\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890330 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-config\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890535 4806 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config" (OuterVolumeSpecName: "config") pod "c96154dd-da6f-45cf-8d96-f3608c0f124a" (UID: "c96154dd-da6f-45cf-8d96-f3608c0f124a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890527 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890622 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdrl\" (UniqueName: \"kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890669 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890744 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-serving-cert\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: 
\"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890796 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-client-ca\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890901 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.890931 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c96154dd-da6f-45cf-8d96-f3608c0f124a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.893967 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c96154dd-da6f-45cf-8d96-f3608c0f124a" (UID: "c96154dd-da6f-45cf-8d96-f3608c0f124a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.896766 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r" (OuterVolumeSpecName: "kube-api-access-cpb8r") pod "c96154dd-da6f-45cf-8d96-f3608c0f124a" (UID: "c96154dd-da6f-45cf-8d96-f3608c0f124a"). InnerVolumeSpecName "kube-api-access-cpb8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991649 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr897\" (UniqueName: \"kubernetes.io/projected/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-kube-api-access-pr897\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991704 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-config\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991723 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991742 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdrl\" (UniqueName: \"kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991758 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991785 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-client-ca\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991800 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-serving-cert\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991831 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991868 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991916 4806 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpb8r\" (UniqueName: \"kubernetes.io/projected/c96154dd-da6f-45cf-8d96-f3608c0f124a-kube-api-access-cpb8r\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.991928 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c96154dd-da6f-45cf-8d96-f3608c0f124a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.992838 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.993618 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.993824 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-client-ca\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.994601 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config\") pod \"controller-manager-5494bb8567-98kj4\" (UID: 
\"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:45 crc kubenswrapper[4806]: I0217 15:25:45.994848 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-config\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:45.996990 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:45.997055 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-serving-cert\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.011693 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr897\" (UniqueName: \"kubernetes.io/projected/58f05ff6-49e7-4d63-80c8-fe7838b7dbc9-kube-api-access-pr897\") pod \"route-controller-manager-847d9569f-2p86l\" (UID: \"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9\") " pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.014155 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdrl\" (UniqueName: 
\"kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl\") pod \"controller-manager-5494bb8567-98kj4\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.142042 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.171973 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.426131 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:25:46 crc kubenswrapper[4806]: W0217 15:25:46.441892 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f49e3ab_863d_4702_b07d_acb7053c7d95.slice/crio-2f2c2b99d549734d2bf3e79fb75d7bde3e1bd9fd4a8a3673360a27f63932c462 WatchSource:0}: Error finding container 2f2c2b99d549734d2bf3e79fb75d7bde3e1bd9fd4a8a3673360a27f63932c462: Status 404 returned error can't find the container with id 2f2c2b99d549734d2bf3e79fb75d7bde3e1bd9fd4a8a3673360a27f63932c462 Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.462451 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l"] Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.539773 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" event={"ID":"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9","Type":"ContainerStarted","Data":"88d1a1799603df7f126329dfc1d993c03863f5bf888ccccad94f72c3dfb8774f"} Feb 17 15:25:46 crc 
kubenswrapper[4806]: I0217 15:25:46.541338 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-644c66787f-2xtcm_c96154dd-da6f-45cf-8d96-f3608c0f124a/route-controller-manager/0.log" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.541515 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" event={"ID":"c96154dd-da6f-45cf-8d96-f3608c0f124a","Type":"ContainerDied","Data":"2102ec9d602fea8fe21bb049cb9bc5d3dede8e48b82c4b04bbb7e3646a4f0322"} Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.541558 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.541573 4806 scope.go:117] "RemoveContainer" containerID="e55a5b76bc5aaecacc226053816c164729bb093051b6f9c3cadf2064ddb850f2" Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.544681 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" event={"ID":"8f49e3ab-863d-4702-b07d-acb7053c7d95","Type":"ContainerStarted","Data":"2f2c2b99d549734d2bf3e79fb75d7bde3e1bd9fd4a8a3673360a27f63932c462"} Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.581487 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:46 crc kubenswrapper[4806]: I0217 15:25:46.585107 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644c66787f-2xtcm"] Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.167909 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96154dd-da6f-45cf-8d96-f3608c0f124a" 
path="/var/lib/kubelet/pods/c96154dd-da6f-45cf-8d96-f3608c0f124a/volumes" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.554898 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" event={"ID":"58f05ff6-49e7-4d63-80c8-fe7838b7dbc9","Type":"ContainerStarted","Data":"b791f4f13923bea522152d0012930cdf82c1e543a3c5882f432426d0d9f6f19d"} Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.555380 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.556741 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" event={"ID":"8f49e3ab-863d-4702-b07d-acb7053c7d95","Type":"ContainerStarted","Data":"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54"} Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.556831 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.567172 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.567636 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.581748 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-847d9569f-2p86l" podStartSLOduration=3.581715761 podStartE2EDuration="3.581715761s" podCreationTimestamp="2026-02-17 15:25:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:25:47.579314014 +0000 UTC m=+309.109944445" watchObservedRunningTime="2026-02-17 15:25:47.581715761 +0000 UTC m=+309.112346182" Feb 17 15:25:47 crc kubenswrapper[4806]: I0217 15:25:47.625957 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" podStartSLOduration=5.625942112 podStartE2EDuration="5.625942112s" podCreationTimestamp="2026-02-17 15:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:25:47.623933145 +0000 UTC m=+309.154563566" watchObservedRunningTime="2026-02-17 15:25:47.625942112 +0000 UTC m=+309.156572523" Feb 17 15:25:52 crc kubenswrapper[4806]: I0217 15:25:52.393130 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.105353 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcgws"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.108842 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.172258 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcgws"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.194266 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.194796 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" podUID="8f49e3ab-863d-4702-b07d-acb7053c7d95" containerName="controller-manager" containerID="cri-o://631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54" gracePeriod=30 Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254702 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254766 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-bound-sa-token\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254789 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-trusted-ca\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254826 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254849 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-tls\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254883 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.254907 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnd78\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-kube-api-access-pnd78\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 
crc kubenswrapper[4806]: I0217 15:26:21.254936 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-certificates\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.286953 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356181 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-certificates\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356253 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-bound-sa-token\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356275 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-trusted-ca\") pod \"image-registry-66df7c8f76-mcgws\" (UID: 
\"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356321 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356367 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-tls\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356421 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.356452 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnd78\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-kube-api-access-pnd78\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.357162 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.357779 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-trusted-ca\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.358030 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-certificates\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.362951 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-registry-tls\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.365912 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.373306 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pnd78\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-kube-api-access-pnd78\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.380753 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccbced01-981b-4f7a-b6b9-38b86cc0b9d0-bound-sa-token\") pod \"image-registry-66df7c8f76-mcgws\" (UID: \"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0\") " pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.478174 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.618484 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.732796 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mcgws"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.762246 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles\") pod \"8f49e3ab-863d-4702-b07d-acb7053c7d95\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.762323 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert\") pod \"8f49e3ab-863d-4702-b07d-acb7053c7d95\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.762451 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca\") pod \"8f49e3ab-863d-4702-b07d-acb7053c7d95\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.762501 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config\") pod \"8f49e3ab-863d-4702-b07d-acb7053c7d95\" (UID: \"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.762536 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrdrl\" (UniqueName: \"kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl\") pod \"8f49e3ab-863d-4702-b07d-acb7053c7d95\" (UID: 
\"8f49e3ab-863d-4702-b07d-acb7053c7d95\") " Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.770468 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f49e3ab-863d-4702-b07d-acb7053c7d95" (UID: "8f49e3ab-863d-4702-b07d-acb7053c7d95"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.771125 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config" (OuterVolumeSpecName: "config") pod "8f49e3ab-863d-4702-b07d-acb7053c7d95" (UID: "8f49e3ab-863d-4702-b07d-acb7053c7d95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.778490 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl" (OuterVolumeSpecName: "kube-api-access-wrdrl") pod "8f49e3ab-863d-4702-b07d-acb7053c7d95" (UID: "8f49e3ab-863d-4702-b07d-acb7053c7d95"). InnerVolumeSpecName "kube-api-access-wrdrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.784706 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8f49e3ab-863d-4702-b07d-acb7053c7d95" (UID: "8f49e3ab-863d-4702-b07d-acb7053c7d95"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.785261 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f49e3ab-863d-4702-b07d-acb7053c7d95" (UID: "8f49e3ab-863d-4702-b07d-acb7053c7d95"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.794865 4806 generic.go:334] "Generic (PLEG): container finished" podID="8f49e3ab-863d-4702-b07d-acb7053c7d95" containerID="631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54" exitCode=0 Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.794964 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" event={"ID":"8f49e3ab-863d-4702-b07d-acb7053c7d95","Type":"ContainerDied","Data":"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54"} Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.795002 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" event={"ID":"8f49e3ab-863d-4702-b07d-acb7053c7d95","Type":"ContainerDied","Data":"2f2c2b99d549734d2bf3e79fb75d7bde3e1bd9fd4a8a3673360a27f63932c462"} Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.795025 4806 scope.go:117] "RemoveContainer" containerID="631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.795391 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5494bb8567-98kj4" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.799450 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" event={"ID":"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0","Type":"ContainerStarted","Data":"5d30aa0114f79f7f04896af37e7cb31330a177d1d233e6cb5b599f84e7fa3824"} Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.819806 4806 scope.go:117] "RemoveContainer" containerID="631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54" Feb 17 15:26:21 crc kubenswrapper[4806]: E0217 15:26:21.820226 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54\": container with ID starting with 631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54 not found: ID does not exist" containerID="631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.820255 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54"} err="failed to get container status \"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54\": rpc error: code = NotFound desc = could not find container \"631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54\": container with ID starting with 631aafd01f06be92c437844da88fdbfc4d1d346341a9746fd412db3c1ee86b54 not found: ID does not exist" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.827288 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.829988 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5494bb8567-98kj4"] Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.866718 4806 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.867506 4806 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f49e3ab-863d-4702-b07d-acb7053c7d95-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.867522 4806 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.867609 4806 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f49e3ab-863d-4702-b07d-acb7053c7d95-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:21 crc kubenswrapper[4806]: I0217 15:26:21.867625 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrdrl\" (UniqueName: \"kubernetes.io/projected/8f49e3ab-863d-4702-b07d-acb7053c7d95-kube-api-access-wrdrl\") on node \"crc\" DevicePath \"\"" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.811198 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" event={"ID":"ccbced01-981b-4f7a-b6b9-38b86cc0b9d0","Type":"ContainerStarted","Data":"1c1c7c1c3217d943f2175312c4cb120f9552794fba0f73ba444f6b2f7fd66b3a"} Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.811398 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.848363 4806 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws" podStartSLOduration=1.848344771 podStartE2EDuration="1.848344771s" podCreationTimestamp="2026-02-17 15:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:26:22.843335477 +0000 UTC m=+344.373965898" watchObservedRunningTime="2026-02-17 15:26:22.848344771 +0000 UTC m=+344.378975192" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.875849 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"] Feb 17 15:26:22 crc kubenswrapper[4806]: E0217 15:26:22.876316 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f49e3ab-863d-4702-b07d-acb7053c7d95" containerName="controller-manager" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.876343 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f49e3ab-863d-4702-b07d-acb7053c7d95" containerName="controller-manager" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.876511 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f49e3ab-863d-4702-b07d-acb7053c7d95" containerName="controller-manager" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.877238 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.880940 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.881639 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.881905 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.882150 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.882825 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.883131 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.887609 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"] Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.893045 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.984871 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45ff31c-8a61-43a2-879f-233bf5d11774-serving-cert\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " 
pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.984962 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.985059 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmtv\" (UniqueName: \"kubernetes.io/projected/b45ff31c-8a61-43a2-879f-233bf5d11774-kube-api-access-swmtv\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.985114 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-client-ca\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:22 crc kubenswrapper[4806]: I0217 15:26:22.985155 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-config\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.086308 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b45ff31c-8a61-43a2-879f-233bf5d11774-serving-cert\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.086365 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.086414 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmtv\" (UniqueName: \"kubernetes.io/projected/b45ff31c-8a61-43a2-879f-233bf5d11774-kube-api-access-swmtv\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.086445 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-client-ca\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.086463 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-config\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.087762 
4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-client-ca\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.087851 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-config\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.088442 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b45ff31c-8a61-43a2-879f-233bf5d11774-proxy-ca-bundles\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.095472 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45ff31c-8a61-43a2-879f-233bf5d11774-serving-cert\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.109804 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmtv\" (UniqueName: \"kubernetes.io/projected/b45ff31c-8a61-43a2-879f-233bf5d11774-kube-api-access-swmtv\") pod \"controller-manager-7bc7bd46c7-47tfz\" (UID: \"b45ff31c-8a61-43a2-879f-233bf5d11774\") " pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" Feb 17 
15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.173538 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f49e3ab-863d-4702-b07d-acb7053c7d95" path="/var/lib/kubelet/pods/8f49e3ab-863d-4702-b07d-acb7053c7d95/volumes"
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.209199 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.484633 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"]
Feb 17 15:26:23 crc kubenswrapper[4806]: W0217 15:26:23.494678 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45ff31c_8a61_43a2_879f_233bf5d11774.slice/crio-c8a52352932d272b5518109de55e172b8e1e7363c6f498014e660973e32c221f WatchSource:0}: Error finding container c8a52352932d272b5518109de55e172b8e1e7363c6f498014e660973e32c221f: Status 404 returned error can't find the container with id c8a52352932d272b5518109de55e172b8e1e7363c6f498014e660973e32c221f
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.822581 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" event={"ID":"b45ff31c-8a61-43a2-879f-233bf5d11774","Type":"ContainerStarted","Data":"88f06242dcd64293d56e2ac9cd63b23f227de2f6913c0cddbf0062ab171a18fe"}
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.823064 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.823090 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" event={"ID":"b45ff31c-8a61-43a2-879f-233bf5d11774","Type":"ContainerStarted","Data":"c8a52352932d272b5518109de55e172b8e1e7363c6f498014e660973e32c221f"}
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.837080 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz"
Feb 17 15:26:23 crc kubenswrapper[4806]: I0217 15:26:23.851892 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bc7bd46c7-47tfz" podStartSLOduration=2.851862858 podStartE2EDuration="2.851862858s" podCreationTimestamp="2026-02-17 15:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:26:23.84790257 +0000 UTC m=+345.378533001" watchObservedRunningTime="2026-02-17 15:26:23.851862858 +0000 UTC m=+345.382493309"
Feb 17 15:26:34 crc kubenswrapper[4806]: I0217 15:26:34.785139 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:26:34 crc kubenswrapper[4806]: I0217 15:26:34.785985 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:26:41 crc kubenswrapper[4806]: I0217 15:26:41.483648 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mcgws"
Feb 17 15:26:41 crc kubenswrapper[4806]: I0217 15:26:41.549873 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"]
Feb 17 15:27:04 crc kubenswrapper[4806]: I0217 15:27:04.785650 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:27:04 crc kubenswrapper[4806]: I0217 15:27:04.786832 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:27:06 crc kubenswrapper[4806]: I0217 15:27:06.605733 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" podUID="32820688-4037-4b80-8a92-9ebe7068d02e" containerName="registry" containerID="cri-o://6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f" gracePeriod=30
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.025603 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084176 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084245 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084315 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084369 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084473 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084731 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084812 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2c2n\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.084877 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca\") pod \"32820688-4037-4b80-8a92-9ebe7068d02e\" (UID: \"32820688-4037-4b80-8a92-9ebe7068d02e\") "
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.086826 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.087145 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.093099 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.093416 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.093628 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n" (OuterVolumeSpecName: "kube-api-access-c2c2n") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "kube-api-access-c2c2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.095908 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.100840 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.107307 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "32820688-4037-4b80-8a92-9ebe7068d02e" (UID: "32820688-4037-4b80-8a92-9ebe7068d02e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186001 4806 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186037 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2c2n\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-kube-api-access-c2c2n\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186049 4806 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186058 4806 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32820688-4037-4b80-8a92-9ebe7068d02e-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186067 4806 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32820688-4037-4b80-8a92-9ebe7068d02e-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186078 4806 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32820688-4037-4b80-8a92-9ebe7068d02e-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.186086 4806 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32820688-4037-4b80-8a92-9ebe7068d02e-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.241518 4806 generic.go:334] "Generic (PLEG): container finished" podID="32820688-4037-4b80-8a92-9ebe7068d02e" containerID="6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f" exitCode=0
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.241611 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" event={"ID":"32820688-4037-4b80-8a92-9ebe7068d02e","Type":"ContainerDied","Data":"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"}
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.241708 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm" event={"ID":"32820688-4037-4b80-8a92-9ebe7068d02e","Type":"ContainerDied","Data":"7bb16b14ca169479b51c8b48af328bec0cd91b7118aff03af3ea2a55b8a181cf"}
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.241743 4806 scope.go:117] "RemoveContainer" containerID="6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.241939 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8tfm"
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.265958 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"]
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.274025 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8tfm"]
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.278037 4806 scope.go:117] "RemoveContainer" containerID="6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"
Feb 17 15:27:07 crc kubenswrapper[4806]: E0217 15:27:07.278725 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f\": container with ID starting with 6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f not found: ID does not exist" containerID="6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"
Feb 17 15:27:07 crc kubenswrapper[4806]: I0217 15:27:07.278786 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f"} err="failed to get container status \"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f\": rpc error: code = NotFound desc = could not find container \"6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f\": container with ID starting with 6a67c50f49e7614296cd1af109e5ca862d07b516f0dac523739d36263a3dc88f not found: ID does not exist"
Feb 17 15:27:09 crc kubenswrapper[4806]: I0217 15:27:09.174964 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32820688-4037-4b80-8a92-9ebe7068d02e" path="/var/lib/kubelet/pods/32820688-4037-4b80-8a92-9ebe7068d02e/volumes"
Feb 17 15:27:34 crc kubenswrapper[4806]: I0217 15:27:34.784748 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:27:34 crc kubenswrapper[4806]: I0217 15:27:34.785626 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:27:34 crc kubenswrapper[4806]: I0217 15:27:34.785713 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx"
Feb 17 15:27:34 crc kubenswrapper[4806]: I0217 15:27:34.786658 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 15:27:34 crc kubenswrapper[4806]: I0217 15:27:34.786808 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b" gracePeriod=600
Feb 17 15:27:35 crc kubenswrapper[4806]: I0217 15:27:35.466657 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b" exitCode=0
Feb 17 15:27:35 crc kubenswrapper[4806]: I0217 15:27:35.466795 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b"}
Feb 17 15:27:35 crc kubenswrapper[4806]: I0217 15:27:35.467559 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162"}
Feb 17 15:27:35 crc kubenswrapper[4806]: I0217 15:27:35.467618 4806 scope.go:117] "RemoveContainer" containerID="af4ad8f4c2b632319ef203d1db36470f0e903720950f48aff0223db1d31add79"
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.793817 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2m855"]
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.799975 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-controller" containerID="cri-o://c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800056 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="nbdb" containerID="cri-o://119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800112 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800231 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-node" containerID="cri-o://9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800260 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="northd" containerID="cri-o://7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800303 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="sbdb" containerID="cri-o://c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.800357 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-acl-logging" containerID="cri-o://a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" gracePeriod=30
Feb 17 15:29:33 crc kubenswrapper[4806]: I0217 15:29:33.846532 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller" containerID="cri-o://7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" gracePeriod=30
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.153962 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/3.log"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.156891 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovn-acl-logging/0.log"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.157475 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovn-controller/0.log"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.158021 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2m855"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222539 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v8grf"]
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222756 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222771 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222784 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222794 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222804 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222813 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222822 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-node"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222831 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-node"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222841 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-acl-logging"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222849 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-acl-logging"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222861 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kubecfg-setup"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222869 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kubecfg-setup"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222885 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32820688-4037-4b80-8a92-9ebe7068d02e" containerName="registry"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222893 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="32820688-4037-4b80-8a92-9ebe7068d02e" containerName="registry"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222904 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222911 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222923 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="northd"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222931 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="northd"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222941 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="sbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222948 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="sbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222960 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="nbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222968 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="nbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.222980 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.222989 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223093 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="32820688-4037-4b80-8a92-9ebe7068d02e" containerName="registry"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223105 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223118 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-node"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223128 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223137 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223149 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223158 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="northd"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223169 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="nbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223178 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223188 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovn-acl-logging"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223201 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="sbdb"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.223306 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223315 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.223328 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223336 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223475 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.223487 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerName="ovnkube-controller"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.225343 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf"
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262203 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262337 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262345 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket" (OuterVolumeSpecName: "log-socket") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262383 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262447 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262496 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262534 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262572 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262584 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262623 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262662 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262707 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262738 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262778 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262809 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262868 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7gh\" (UniqueName: \"kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262917 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262945 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") "
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262956 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash" (OuterVolumeSpecName: "host-slash") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262977 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-cni-bin".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262977 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.262986 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263020 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263039 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263044 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263064 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263082 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet\") pod \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\" (UID: \"1e6a2d66-f11a-48f6-8d86-5295cb917b7f\") " Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263152 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qpbw\" (UniqueName: \"kubernetes.io/projected/caf7ce6c-1324-401e-af06-b7f3cae6b70d-kube-api-access-7qpbw\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263176 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-netd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263203 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-log-socket\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc 
kubenswrapper[4806]: I0217 15:29:34.263222 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-slash\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263238 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-netns\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263054 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263209 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263129 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263169 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263196 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log" (OuterVolumeSpecName: "node-log") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263189 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263234 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263096 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263332 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-systemd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263615 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-var-lib-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263687 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-env-overrides\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263720 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263744 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-kubelet\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263805 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-config\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263810 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263850 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-systemd-units\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.263969 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-script-lib\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264056 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-node-log\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264105 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264146 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-bin\") pod \"ovnkube-node-v8grf\" (UID: 
\"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264178 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264253 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264294 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-etc-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264324 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-ovn\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264353 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovn-node-metrics-cert\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264494 4806 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264521 4806 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264541 4806 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264560 4806 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264577 4806 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264595 4806 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264613 4806 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264631 4806 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264649 4806 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264670 4806 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264751 4806 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264793 4806 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264823 4806 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264847 4806 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-node-log\") on node 
\"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264873 4806 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264890 4806 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.264906 4806 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.268859 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh" (OuterVolumeSpecName: "kube-api-access-bk7gh") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "kube-api-access-bk7gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.269055 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.275466 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1e6a2d66-f11a-48f6-8d86-5295cb917b7f" (UID: "1e6a2d66-f11a-48f6-8d86-5295cb917b7f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.279328 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/2.log" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.280275 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/1.log" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.280337 4806 generic.go:334] "Generic (PLEG): container finished" podID="344f8a87-e00f-4f0a-a0bc-aee197271160" containerID="568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa" exitCode=2 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.280472 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerDied","Data":"568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.280558 4806 scope.go:117] "RemoveContainer" containerID="4a0181687ccf9dfb8b31b509289e447edb7400c090cd603c1ecbefdd0fab6735" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.281504 4806 scope.go:117] "RemoveContainer" containerID="568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.281797 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wgg2s_openshift-multus(344f8a87-e00f-4f0a-a0bc-aee197271160)\"" pod="openshift-multus/multus-wgg2s" podUID="344f8a87-e00f-4f0a-a0bc-aee197271160" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.287509 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovnkube-controller/3.log" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.298463 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovn-acl-logging/0.log" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.300391 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2m855_1e6a2d66-f11a-48f6-8d86-5295cb917b7f/ovn-controller/0.log" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301068 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" exitCode=0 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301101 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" exitCode=0 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301111 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" exitCode=0 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301119 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" exitCode=0 Feb 17 15:29:34 crc 
kubenswrapper[4806]: I0217 15:29:34.301129 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" exitCode=0 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301137 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" exitCode=0 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301145 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" exitCode=143 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301154 4806 generic.go:334] "Generic (PLEG): container finished" podID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" exitCode=143 Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301177 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301210 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301221 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:29:34 crc kubenswrapper[4806]: 
I0217 15:29:34.301230 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301239 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301248 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301259 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301270 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301277 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301283 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301290 4806 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301296 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301303 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301309 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301314 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301319 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301326 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301333 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 
15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301339 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301345 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301350 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301355 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301360 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301365 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301370 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301375 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 
15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301380 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301387 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301395 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301418 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301423 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301429 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301434 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301439 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301445 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301450 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301456 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301470 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301477 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" event={"ID":"1e6a2d66-f11a-48f6-8d86-5295cb917b7f","Type":"ContainerDied","Data":"33663b71273cc60bd5ccd7cadbcb1760143995fe20b7c9bbe4419013de836616"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301486 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301492 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301498 4806 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301503 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301508 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301513 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301518 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301525 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301530 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301535 4806 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.301295 4806 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2m855" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.340060 4806 scope.go:117] "RemoveContainer" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.341245 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2m855"] Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.346912 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2m855"] Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.360591 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365152 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365184 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-etc-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365202 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-ovn\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365220 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovn-node-metrics-cert\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365245 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qpbw\" (UniqueName: \"kubernetes.io/projected/caf7ce6c-1324-401e-af06-b7f3cae6b70d-kube-api-access-7qpbw\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365268 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-netd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365298 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-log-socket\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365325 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-slash\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc 
kubenswrapper[4806]: I0217 15:29:34.365343 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-netns\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365371 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-systemd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365423 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-var-lib-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365441 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-env-overrides\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365455 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-kubelet\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365471 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-config\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365488 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-systemd-units\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365517 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-script-lib\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365530 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-node-log\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365548 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365563 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-bin\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365577 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365612 4806 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365625 4806 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365635 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7gh\" (UniqueName: \"kubernetes.io/projected/1e6a2d66-f11a-48f6-8d86-5295cb917b7f-kube-api-access-bk7gh\") on node \"crc\" DevicePath \"\"" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365673 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.365954 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-var-lib-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366009 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-netd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366057 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-log-socket\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366082 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-slash\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366104 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-netns\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366149 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-etc-openvswitch\") pod \"ovnkube-node-v8grf\" (UID: 
\"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366287 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-systemd\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366612 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-cni-bin\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366584 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-node-log\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366616 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-run-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366677 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-run-ovn\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: 
I0217 15:29:34.366710 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366552 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-env-overrides\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366915 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-host-kubelet\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.366935 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caf7ce6c-1324-401e-af06-b7f3cae6b70d-systemd-units\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.367072 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-script-lib\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.367132 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovnkube-config\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.368079 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caf7ce6c-1324-401e-af06-b7f3cae6b70d-ovn-node-metrics-cert\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.378295 4806 scope.go:117] "RemoveContainer" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.380865 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qpbw\" (UniqueName: \"kubernetes.io/projected/caf7ce6c-1324-401e-af06-b7f3cae6b70d-kube-api-access-7qpbw\") pod \"ovnkube-node-v8grf\" (UID: \"caf7ce6c-1324-401e-af06-b7f3cae6b70d\") " pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.391515 4806 scope.go:117] "RemoveContainer" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.404377 4806 scope.go:117] "RemoveContainer" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.417345 4806 scope.go:117] "RemoveContainer" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.427622 4806 scope.go:117] "RemoveContainer" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 
15:29:34.439717 4806 scope.go:117] "RemoveContainer" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.452636 4806 scope.go:117] "RemoveContainer" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.466496 4806 scope.go:117] "RemoveContainer" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.480654 4806 scope.go:117] "RemoveContainer" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.481162 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": container with ID starting with 7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6 not found: ID does not exist" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.481209 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} err="failed to get container status \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": rpc error: code = NotFound desc = could not find container \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": container with ID starting with 7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.481243 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.481682 4806 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": container with ID starting with 16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17 not found: ID does not exist" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.481733 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} err="failed to get container status \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": rpc error: code = NotFound desc = could not find container \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": container with ID starting with 16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.481767 4806 scope.go:117] "RemoveContainer" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.482060 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": container with ID starting with c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d not found: ID does not exist" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.482098 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} err="failed to get container status \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": rpc error: code = NotFound desc = could 
not find container \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": container with ID starting with c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.482122 4806 scope.go:117] "RemoveContainer" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.483209 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": container with ID starting with 119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de not found: ID does not exist" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483236 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} err="failed to get container status \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": rpc error: code = NotFound desc = could not find container \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": container with ID starting with 119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483252 4806 scope.go:117] "RemoveContainer" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.483566 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": container with ID starting with 7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa not found: 
ID does not exist" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483597 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} err="failed to get container status \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": rpc error: code = NotFound desc = could not find container \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": container with ID starting with 7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483623 4806 scope.go:117] "RemoveContainer" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.483850 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": container with ID starting with 1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918 not found: ID does not exist" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483881 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} err="failed to get container status \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": rpc error: code = NotFound desc = could not find container \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": container with ID starting with 1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.483900 4806 
scope.go:117] "RemoveContainer" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.484118 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": container with ID starting with 9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e not found: ID does not exist" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484153 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} err="failed to get container status \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": rpc error: code = NotFound desc = could not find container \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": container with ID starting with 9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484175 4806 scope.go:117] "RemoveContainer" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.484458 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": container with ID starting with a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707 not found: ID does not exist" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484489 4806 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} err="failed to get container status \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": rpc error: code = NotFound desc = could not find container \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": container with ID starting with a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484507 4806 scope.go:117] "RemoveContainer" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.484836 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": container with ID starting with c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b not found: ID does not exist" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484925 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} err="failed to get container status \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": rpc error: code = NotFound desc = could not find container \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": container with ID starting with c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.484985 4806 scope.go:117] "RemoveContainer" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: E0217 15:29:34.485344 4806 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": container with ID starting with 26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1 not found: ID does not exist" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.485391 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} err="failed to get container status \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": rpc error: code = NotFound desc = could not find container \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": container with ID starting with 26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.485472 4806 scope.go:117] "RemoveContainer" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.485965 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} err="failed to get container status \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": rpc error: code = NotFound desc = could not find container \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": container with ID starting with 7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.485991 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486242 4806 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} err="failed to get container status \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": rpc error: code = NotFound desc = could not find container \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": container with ID starting with 16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486265 4806 scope.go:117] "RemoveContainer" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486534 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} err="failed to get container status \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": rpc error: code = NotFound desc = could not find container \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": container with ID starting with c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486563 4806 scope.go:117] "RemoveContainer" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486765 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} err="failed to get container status \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": rpc error: code = NotFound desc = could not find container \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": container with ID starting with 
119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.486791 4806 scope.go:117] "RemoveContainer" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.488227 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} err="failed to get container status \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": rpc error: code = NotFound desc = could not find container \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": container with ID starting with 7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.488315 4806 scope.go:117] "RemoveContainer" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.488752 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} err="failed to get container status \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": rpc error: code = NotFound desc = could not find container \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": container with ID starting with 1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.488784 4806 scope.go:117] "RemoveContainer" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489106 4806 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} err="failed to get container status \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": rpc error: code = NotFound desc = could not find container \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": container with ID starting with 9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489148 4806 scope.go:117] "RemoveContainer" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489472 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} err="failed to get container status \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": rpc error: code = NotFound desc = could not find container \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": container with ID starting with a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489498 4806 scope.go:117] "RemoveContainer" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489787 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} err="failed to get container status \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": rpc error: code = NotFound desc = could not find container \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": container with ID starting with c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b not found: ID does not 
exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.489831 4806 scope.go:117] "RemoveContainer" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490125 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} err="failed to get container status \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": rpc error: code = NotFound desc = could not find container \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": container with ID starting with 26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490154 4806 scope.go:117] "RemoveContainer" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490416 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} err="failed to get container status \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": rpc error: code = NotFound desc = could not find container \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": container with ID starting with 7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490442 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490772 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} err="failed to get container status 
\"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": rpc error: code = NotFound desc = could not find container \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": container with ID starting with 16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.490798 4806 scope.go:117] "RemoveContainer" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491084 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} err="failed to get container status \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": rpc error: code = NotFound desc = could not find container \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": container with ID starting with c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491111 4806 scope.go:117] "RemoveContainer" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491374 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} err="failed to get container status \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": rpc error: code = NotFound desc = could not find container \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": container with ID starting with 119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491417 4806 scope.go:117] "RemoveContainer" 
containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491677 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} err="failed to get container status \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": rpc error: code = NotFound desc = could not find container \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": container with ID starting with 7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491703 4806 scope.go:117] "RemoveContainer" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491947 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} err="failed to get container status \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": rpc error: code = NotFound desc = could not find container \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": container with ID starting with 1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.491973 4806 scope.go:117] "RemoveContainer" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492210 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} err="failed to get container status \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": rpc error: code = NotFound desc = could 
not find container \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": container with ID starting with 9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492236 4806 scope.go:117] "RemoveContainer" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492642 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} err="failed to get container status \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": rpc error: code = NotFound desc = could not find container \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": container with ID starting with a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492669 4806 scope.go:117] "RemoveContainer" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492923 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} err="failed to get container status \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": rpc error: code = NotFound desc = could not find container \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": container with ID starting with c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.492950 4806 scope.go:117] "RemoveContainer" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 
15:29:34.493196 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} err="failed to get container status \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": rpc error: code = NotFound desc = could not find container \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": container with ID starting with 26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.493222 4806 scope.go:117] "RemoveContainer" containerID="7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.493526 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6"} err="failed to get container status \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": rpc error: code = NotFound desc = could not find container \"7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6\": container with ID starting with 7014672af3da6a303c0508496988c4c5f125cd7a8e3042bcf8d5c52bfe91bfe6 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.493561 4806 scope.go:117] "RemoveContainer" containerID="16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.493854 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17"} err="failed to get container status \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": rpc error: code = NotFound desc = could not find container \"16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17\": container with ID starting with 
16a0fa6c4f64e62e5bc39f8de8555300bdad37c8b1863fe2fddd3d371b4bdf17 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.493889 4806 scope.go:117] "RemoveContainer" containerID="c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494156 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d"} err="failed to get container status \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": rpc error: code = NotFound desc = could not find container \"c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d\": container with ID starting with c4c8a444a3b69b7c0f5413ab46ad4631c7dd4c67e038ac43311c8779b9a8624d not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494183 4806 scope.go:117] "RemoveContainer" containerID="119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494445 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de"} err="failed to get container status \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": rpc error: code = NotFound desc = could not find container \"119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de\": container with ID starting with 119b898fdafd081b590a7441a43028c0f9ba80c5328a125f2107aaf333c585de not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494471 4806 scope.go:117] "RemoveContainer" containerID="7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494720 4806 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa"} err="failed to get container status \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": rpc error: code = NotFound desc = could not find container \"7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa\": container with ID starting with 7a18cf3ec8a0a5974916f67a708d64ac2c4ce06cc52dce1f0c92db825a298faa not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494747 4806 scope.go:117] "RemoveContainer" containerID="1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.494981 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918"} err="failed to get container status \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": rpc error: code = NotFound desc = could not find container \"1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918\": container with ID starting with 1a24d655e1c85e8371c108b124f7ad42399a672e6b9d2575f3c8be4babbc6918 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.495022 4806 scope.go:117] "RemoveContainer" containerID="9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.495287 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e"} err="failed to get container status \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": rpc error: code = NotFound desc = could not find container \"9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e\": container with ID starting with 9f99a057d33946ca626fe0c961cd9c853442af68391a6099ddd61130d826a17e not found: ID does not 
exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.495312 4806 scope.go:117] "RemoveContainer" containerID="a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.495594 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707"} err="failed to get container status \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": rpc error: code = NotFound desc = could not find container \"a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707\": container with ID starting with a64273321355cd62d6b2e09d41a0920ba6280d560b12edbb60c0ed75faa43707 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.495618 4806 scope.go:117] "RemoveContainer" containerID="c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.496028 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b"} err="failed to get container status \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": rpc error: code = NotFound desc = could not find container \"c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b\": container with ID starting with c7e49bb04a479fffeaab2aa3d87c5d67c8dda32e488463e3ba4cba06d8cebf9b not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.496074 4806 scope.go:117] "RemoveContainer" containerID="26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.496370 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1"} err="failed to get container status 
\"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": rpc error: code = NotFound desc = could not find container \"26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1\": container with ID starting with 26f3c29d7c5fd806c0b2d14026d2a172d9e7e1a293ee88c3298506c0206aa6d1 not found: ID does not exist" Feb 17 15:29:34 crc kubenswrapper[4806]: I0217 15:29:34.542274 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:35 crc kubenswrapper[4806]: I0217 15:29:35.169781 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6a2d66-f11a-48f6-8d86-5295cb917b7f" path="/var/lib/kubelet/pods/1e6a2d66-f11a-48f6-8d86-5295cb917b7f/volumes" Feb 17 15:29:35 crc kubenswrapper[4806]: I0217 15:29:35.308859 4806 generic.go:334] "Generic (PLEG): container finished" podID="caf7ce6c-1324-401e-af06-b7f3cae6b70d" containerID="6735337a279a7b5bdd204a1ba559643b414ea00cba9e398438d6482cf1dd1bc6" exitCode=0 Feb 17 15:29:35 crc kubenswrapper[4806]: I0217 15:29:35.308954 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerDied","Data":"6735337a279a7b5bdd204a1ba559643b414ea00cba9e398438d6482cf1dd1bc6"} Feb 17 15:29:35 crc kubenswrapper[4806]: I0217 15:29:35.309041 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"23113fe955f13287ab51be3931decb48459f1bb94cc3faa9b897847cea324ab7"} Feb 17 15:29:35 crc kubenswrapper[4806]: I0217 15:29:35.313839 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/2.log" Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.329389 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"5f6b240ea87e0a7b5bcaacf2508e08c93af54a9994cd4c112eb175032eb786a3"} Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.330326 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"7a012b4dc9e775bbad2f23d0fc06362c4e9358c683f8ef101ba7913fb3931ace"} Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.330348 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"2cb713f9c3f25ecce51773b0bf42656f8d026568182e8087281a3616d8cbe8ee"} Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.330360 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"540dc6b80e7a54615232d20496b310dab89971a24ee616d1a1ba5d36378e8ce2"} Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.330371 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"87e34fae85286dbaacef45e1d194093556f20505a7fa2ef0b5546bd019afe405"} Feb 17 15:29:36 crc kubenswrapper[4806]: I0217 15:29:36.330385 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"21d23b4e1be524cf95ea50bc6e79acc2fdbf5b563be6f49ed22b5b8df7469cab"} Feb 17 15:29:39 crc kubenswrapper[4806]: I0217 15:29:39.358654 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" 
event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"71fdf8223c668e72ea2fceb9eee7e2f09046054aa4a6b407d6617e376a671e82"} Feb 17 15:29:41 crc kubenswrapper[4806]: I0217 15:29:41.376106 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" event={"ID":"caf7ce6c-1324-401e-af06-b7f3cae6b70d","Type":"ContainerStarted","Data":"3e09b272e27d94c76860f99f9c2fff39610a197946cf94432cf9970ae04c7d13"} Feb 17 15:29:41 crc kubenswrapper[4806]: I0217 15:29:41.376908 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:41 crc kubenswrapper[4806]: I0217 15:29:41.376980 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:41 crc kubenswrapper[4806]: I0217 15:29:41.421603 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:41 crc kubenswrapper[4806]: I0217 15:29:41.459582 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" podStartSLOduration=7.459539344 podStartE2EDuration="7.459539344s" podCreationTimestamp="2026-02-17 15:29:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:29:41.404961583 +0000 UTC m=+542.935592034" watchObservedRunningTime="2026-02-17 15:29:41.459539344 +0000 UTC m=+542.990169755" Feb 17 15:29:42 crc kubenswrapper[4806]: I0217 15:29:42.384104 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:42 crc kubenswrapper[4806]: I0217 15:29:42.419792 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:29:47 crc 
kubenswrapper[4806]: I0217 15:29:47.162768 4806 scope.go:117] "RemoveContainer" containerID="568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa" Feb 17 15:29:47 crc kubenswrapper[4806]: E0217 15:29:47.164045 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wgg2s_openshift-multus(344f8a87-e00f-4f0a-a0bc-aee197271160)\"" pod="openshift-multus/multus-wgg2s" podUID="344f8a87-e00f-4f0a-a0bc-aee197271160" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.195815 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz"] Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.197231 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.199360 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.199360 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.207215 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz"] Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.345524 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.345645 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8h5\" (UniqueName: \"kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.345748 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.447642 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.447698 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8h5\" (UniqueName: \"kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.447736 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.448589 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.457462 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.469285 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8h5\" (UniqueName: \"kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5\") pod \"collect-profiles-29522370-hn6jz\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: I0217 15:30:00.544372 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: E0217 15:30:00.584213 4806 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(c3a42201e07f9f3ef29c7f042beffb8f25eb07610140dad1be9ce8d953998c17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 15:30:00 crc kubenswrapper[4806]: E0217 15:30:00.584292 4806 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(c3a42201e07f9f3ef29c7f042beffb8f25eb07610140dad1be9ce8d953998c17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: E0217 15:30:00.584320 4806 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(c3a42201e07f9f3ef29c7f042beffb8f25eb07610140dad1be9ce8d953998c17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:00 crc kubenswrapper[4806]: E0217 15:30:00.584375 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager(880f5a95-9039-40c3-9d94-432012ba725e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager(880f5a95-9039-40c3-9d94-432012ba725e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(c3a42201e07f9f3ef29c7f042beffb8f25eb07610140dad1be9ce8d953998c17): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" podUID="880f5a95-9039-40c3-9d94-432012ba725e" Feb 17 15:30:01 crc kubenswrapper[4806]: I0217 15:30:01.507330 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:01 crc kubenswrapper[4806]: I0217 15:30:01.508546 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:01 crc kubenswrapper[4806]: E0217 15:30:01.548609 4806 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(fafc24c5775c13dad067d07cd217aa05e3ceee12a76998aca5d9c1eae8093a11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 15:30:01 crc kubenswrapper[4806]: E0217 15:30:01.548715 4806 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(fafc24c5775c13dad067d07cd217aa05e3ceee12a76998aca5d9c1eae8093a11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:01 crc kubenswrapper[4806]: E0217 15:30:01.548771 4806 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(fafc24c5775c13dad067d07cd217aa05e3ceee12a76998aca5d9c1eae8093a11): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:01 crc kubenswrapper[4806]: E0217 15:30:01.548886 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager(880f5a95-9039-40c3-9d94-432012ba725e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager(880f5a95-9039-40c3-9d94-432012ba725e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29522370-hn6jz_openshift-operator-lifecycle-manager_880f5a95-9039-40c3-9d94-432012ba725e_0(fafc24c5775c13dad067d07cd217aa05e3ceee12a76998aca5d9c1eae8093a11): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" podUID="880f5a95-9039-40c3-9d94-432012ba725e" Feb 17 15:30:02 crc kubenswrapper[4806]: I0217 15:30:02.161241 4806 scope.go:117] "RemoveContainer" containerID="568e6de38434e4eebd27b387fdedb2cd85a0c8630950783ecdb5f0697f6d7faa" Feb 17 15:30:02 crc kubenswrapper[4806]: I0217 15:30:02.517135 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgg2s_344f8a87-e00f-4f0a-a0bc-aee197271160/kube-multus/2.log" Feb 17 15:30:02 crc kubenswrapper[4806]: I0217 15:30:02.517801 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgg2s" event={"ID":"344f8a87-e00f-4f0a-a0bc-aee197271160","Type":"ContainerStarted","Data":"36159217454f5e2b72d9c969295585c02a9e03681b0a853c5c686af14406527b"} Feb 17 15:30:04 crc kubenswrapper[4806]: I0217 15:30:04.579440 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v8grf" Feb 17 15:30:04 crc kubenswrapper[4806]: I0217 15:30:04.785825 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:30:04 crc kubenswrapper[4806]: I0217 15:30:04.785899 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.270968 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x"] 
Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.273112 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.276056 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.290106 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x"] Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.406212 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.406454 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvt6m\" (UniqueName: \"kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.406577 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.507784 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.508523 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvt6m\" (UniqueName: \"kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.508579 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.508632 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.509083 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.538034 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvt6m\" (UniqueName: \"kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.599727 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:09 crc kubenswrapper[4806]: I0217 15:30:09.877097 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x"] Feb 17 15:30:10 crc kubenswrapper[4806]: I0217 15:30:10.574182 4806 generic.go:334] "Generic (PLEG): container finished" podID="42616f66-dce6-45b0-b11e-2802747e1212" containerID="a13917c59d887605f2aebbaa64d2ef55127954af685c7b82a7820c6266dadaca" exitCode=0 Feb 17 15:30:10 crc kubenswrapper[4806]: I0217 15:30:10.574298 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" event={"ID":"42616f66-dce6-45b0-b11e-2802747e1212","Type":"ContainerDied","Data":"a13917c59d887605f2aebbaa64d2ef55127954af685c7b82a7820c6266dadaca"} Feb 17 15:30:10 crc kubenswrapper[4806]: I0217 15:30:10.575122 4806 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" event={"ID":"42616f66-dce6-45b0-b11e-2802747e1212","Type":"ContainerStarted","Data":"cedb4967c387ac9634f2f95517a82a9459b949cb1c78575115732d95ec21eda2"} Feb 17 15:30:10 crc kubenswrapper[4806]: I0217 15:30:10.580440 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:30:12 crc kubenswrapper[4806]: I0217 15:30:12.594266 4806 generic.go:334] "Generic (PLEG): container finished" podID="42616f66-dce6-45b0-b11e-2802747e1212" containerID="3057cb30d80db6c622ce6ebadacee652f67b5904e4c6786f05e21b432da336d2" exitCode=0 Feb 17 15:30:12 crc kubenswrapper[4806]: I0217 15:30:12.594332 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" event={"ID":"42616f66-dce6-45b0-b11e-2802747e1212","Type":"ContainerDied","Data":"3057cb30d80db6c622ce6ebadacee652f67b5904e4c6786f05e21b432da336d2"} Feb 17 15:30:13 crc kubenswrapper[4806]: I0217 15:30:13.606359 4806 generic.go:334] "Generic (PLEG): container finished" podID="42616f66-dce6-45b0-b11e-2802747e1212" containerID="d13362e259394157f0f1b9b1b067bbace7366a7b1dd206a58c5f00f48701632b" exitCode=0 Feb 17 15:30:13 crc kubenswrapper[4806]: I0217 15:30:13.606475 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" event={"ID":"42616f66-dce6-45b0-b11e-2802747e1212","Type":"ContainerDied","Data":"d13362e259394157f0f1b9b1b067bbace7366a7b1dd206a58c5f00f48701632b"} Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.160719 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.161275 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.443105 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz"] Feb 17 15:30:14 crc kubenswrapper[4806]: W0217 15:30:14.449897 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod880f5a95_9039_40c3_9d94_432012ba725e.slice/crio-b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6 WatchSource:0}: Error finding container b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6: Status 404 returned error can't find the container with id b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6 Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.615965 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" event={"ID":"880f5a95-9039-40c3-9d94-432012ba725e","Type":"ContainerStarted","Data":"a3877f226e6fda06c58b49cbc36d5018694613473a91fc0a95b87ff19508da8c"} Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.616013 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" event={"ID":"880f5a95-9039-40c3-9d94-432012ba725e","Type":"ContainerStarted","Data":"b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6"} Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.633880 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" podStartSLOduration=14.63385372 podStartE2EDuration="14.63385372s" podCreationTimestamp="2026-02-17 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
15:30:14.633570603 +0000 UTC m=+576.164201014" watchObservedRunningTime="2026-02-17 15:30:14.63385372 +0000 UTC m=+576.164484161" Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.892333 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.997609 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle\") pod \"42616f66-dce6-45b0-b11e-2802747e1212\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.997728 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util\") pod \"42616f66-dce6-45b0-b11e-2802747e1212\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.997758 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvt6m\" (UniqueName: \"kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m\") pod \"42616f66-dce6-45b0-b11e-2802747e1212\" (UID: \"42616f66-dce6-45b0-b11e-2802747e1212\") " Feb 17 15:30:14 crc kubenswrapper[4806]: I0217 15:30:14.999629 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle" (OuterVolumeSpecName: "bundle") pod "42616f66-dce6-45b0-b11e-2802747e1212" (UID: "42616f66-dce6-45b0-b11e-2802747e1212"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.004672 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m" (OuterVolumeSpecName: "kube-api-access-xvt6m") pod "42616f66-dce6-45b0-b11e-2802747e1212" (UID: "42616f66-dce6-45b0-b11e-2802747e1212"). InnerVolumeSpecName "kube-api-access-xvt6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.031189 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util" (OuterVolumeSpecName: "util") pod "42616f66-dce6-45b0-b11e-2802747e1212" (UID: "42616f66-dce6-45b0-b11e-2802747e1212"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.099304 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.099859 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/42616f66-dce6-45b0-b11e-2802747e1212-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.099883 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvt6m\" (UniqueName: \"kubernetes.io/projected/42616f66-dce6-45b0-b11e-2802747e1212-kube-api-access-xvt6m\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.624900 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" 
event={"ID":"42616f66-dce6-45b0-b11e-2802747e1212","Type":"ContainerDied","Data":"cedb4967c387ac9634f2f95517a82a9459b949cb1c78575115732d95ec21eda2"} Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.624950 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cedb4967c387ac9634f2f95517a82a9459b949cb1c78575115732d95ec21eda2" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.625004 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x" Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.627318 4806 generic.go:334] "Generic (PLEG): container finished" podID="880f5a95-9039-40c3-9d94-432012ba725e" containerID="a3877f226e6fda06c58b49cbc36d5018694613473a91fc0a95b87ff19508da8c" exitCode=0 Feb 17 15:30:15 crc kubenswrapper[4806]: I0217 15:30:15.627363 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" event={"ID":"880f5a95-9039-40c3-9d94-432012ba725e","Type":"ContainerDied","Data":"a3877f226e6fda06c58b49cbc36d5018694613473a91fc0a95b87ff19508da8c"} Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.000177 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.133252 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l8h5\" (UniqueName: \"kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5\") pod \"880f5a95-9039-40c3-9d94-432012ba725e\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.133375 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume\") pod \"880f5a95-9039-40c3-9d94-432012ba725e\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.133729 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume\") pod \"880f5a95-9039-40c3-9d94-432012ba725e\" (UID: \"880f5a95-9039-40c3-9d94-432012ba725e\") " Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.134958 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume" (OuterVolumeSpecName: "config-volume") pod "880f5a95-9039-40c3-9d94-432012ba725e" (UID: "880f5a95-9039-40c3-9d94-432012ba725e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.140647 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5" (OuterVolumeSpecName: "kube-api-access-6l8h5") pod "880f5a95-9039-40c3-9d94-432012ba725e" (UID: "880f5a95-9039-40c3-9d94-432012ba725e"). 
InnerVolumeSpecName "kube-api-access-6l8h5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.141607 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "880f5a95-9039-40c3-9d94-432012ba725e" (UID: "880f5a95-9039-40c3-9d94-432012ba725e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.235948 4806 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/880f5a95-9039-40c3-9d94-432012ba725e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.235999 4806 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/880f5a95-9039-40c3-9d94-432012ba725e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.236013 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l8h5\" (UniqueName: \"kubernetes.io/projected/880f5a95-9039-40c3-9d94-432012ba725e-kube-api-access-6l8h5\") on node \"crc\" DevicePath \"\"" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.676528 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" event={"ID":"880f5a95-9039-40c3-9d94-432012ba725e","Type":"ContainerDied","Data":"b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6"} Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.676587 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b28c57ad50815319a407dcbc5900db9c52661f07e78e449492261319f6b46bc6" Feb 17 15:30:17 crc kubenswrapper[4806]: I0217 15:30:17.676667 4806 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522370-hn6jz" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.317419 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t"] Feb 17 15:30:25 crc kubenswrapper[4806]: E0217 15:30:25.319589 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="pull" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.319673 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="pull" Feb 17 15:30:25 crc kubenswrapper[4806]: E0217 15:30:25.319729 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="util" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.319779 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="util" Feb 17 15:30:25 crc kubenswrapper[4806]: E0217 15:30:25.319832 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="extract" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.319882 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="extract" Feb 17 15:30:25 crc kubenswrapper[4806]: E0217 15:30:25.319947 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="880f5a95-9039-40c3-9d94-432012ba725e" containerName="collect-profiles" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.319997 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f5a95-9039-40c3-9d94-432012ba725e" containerName="collect-profiles" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.320127 4806 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="42616f66-dce6-45b0-b11e-2802747e1212" containerName="extract" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.320188 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="880f5a95-9039-40c3-9d94-432012ba725e" containerName="collect-profiles" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.320607 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.323803 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.323803 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.323922 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vbqlj" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.328078 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.329252 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.342845 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t"] Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.441958 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pc9\" (UniqueName: \"kubernetes.io/projected/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-kube-api-access-j9pc9\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " 
pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.442058 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-apiservice-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.442103 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-webhook-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.542968 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-apiservice-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.543029 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-webhook-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.543064 4806 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j9pc9\" (UniqueName: \"kubernetes.io/projected/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-kube-api-access-j9pc9\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.549444 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-webhook-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.550792 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-apiservice-cert\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.571958 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9pc9\" (UniqueName: \"kubernetes.io/projected/e08b3bf4-2745-4fd2-8cfa-1de763d3a957-kube-api-access-j9pc9\") pod \"metallb-operator-controller-manager-5bc4556c9c-hnh6t\" (UID: \"e08b3bf4-2745-4fd2-8cfa-1de763d3a957\") " pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.635603 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr"] Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.636604 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.636616 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.638242 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.638752 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.639506 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-m2jlk" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.650638 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr"] Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.745280 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-webhook-cert\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.745361 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnbd\" (UniqueName: \"kubernetes.io/projected/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-kube-api-access-ptnbd\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 
15:30:25.745412 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-apiservice-cert\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.848126 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-apiservice-cert\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.848700 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-webhook-cert\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.848822 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnbd\" (UniqueName: \"kubernetes.io/projected/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-kube-api-access-ptnbd\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.853303 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-webhook-cert\") pod 
\"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.854992 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-apiservice-cert\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.870737 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t"] Feb 17 15:30:25 crc kubenswrapper[4806]: I0217 15:30:25.871771 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnbd\" (UniqueName: \"kubernetes.io/projected/1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02-kube-api-access-ptnbd\") pod \"metallb-operator-webhook-server-86df85fbff-5qgpr\" (UID: \"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02\") " pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:26 crc kubenswrapper[4806]: I0217 15:30:26.024710 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:26 crc kubenswrapper[4806]: I0217 15:30:26.264031 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr"] Feb 17 15:30:26 crc kubenswrapper[4806]: I0217 15:30:26.739058 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" event={"ID":"e08b3bf4-2745-4fd2-8cfa-1de763d3a957","Type":"ContainerStarted","Data":"fd94bbcdfe53800bf56244facb2b6f6a31d0d9601ca4f3159464c6cbcd948a80"} Feb 17 15:30:26 crc kubenswrapper[4806]: I0217 15:30:26.740732 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" event={"ID":"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02","Type":"ContainerStarted","Data":"9c9a5f963791b17c9aa5e2f8f9415b7023ffae0054f4638f94ff54b2ee10a87e"} Feb 17 15:30:28 crc kubenswrapper[4806]: I0217 15:30:28.751074 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" event={"ID":"e08b3bf4-2745-4fd2-8cfa-1de763d3a957","Type":"ContainerStarted","Data":"f1b5bfe946a9923c908b7e8069cd75d6d5dfc9b279d5cb22372bde05009ab074"} Feb 17 15:30:28 crc kubenswrapper[4806]: I0217 15:30:28.752061 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:30:28 crc kubenswrapper[4806]: I0217 15:30:28.771073 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" podStartSLOduration=1.214555252 podStartE2EDuration="3.771046458s" podCreationTimestamp="2026-02-17 15:30:25 +0000 UTC" firstStartedPulling="2026-02-17 15:30:25.881445093 +0000 UTC m=+587.412075504" lastFinishedPulling="2026-02-17 15:30:28.437936299 +0000 UTC 
m=+589.968566710" observedRunningTime="2026-02-17 15:30:28.76867052 +0000 UTC m=+590.299300921" watchObservedRunningTime="2026-02-17 15:30:28.771046458 +0000 UTC m=+590.301676869" Feb 17 15:30:31 crc kubenswrapper[4806]: I0217 15:30:31.770223 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" event={"ID":"1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02","Type":"ContainerStarted","Data":"581f77e18ff4c162a460c814e2c0a4371d756e93164727e101a3dc30774502ba"} Feb 17 15:30:31 crc kubenswrapper[4806]: I0217 15:30:31.770477 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:30:31 crc kubenswrapper[4806]: I0217 15:30:31.799851 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" podStartSLOduration=1.60959803 podStartE2EDuration="6.799826447s" podCreationTimestamp="2026-02-17 15:30:25 +0000 UTC" firstStartedPulling="2026-02-17 15:30:26.284085427 +0000 UTC m=+587.814715838" lastFinishedPulling="2026-02-17 15:30:31.474313834 +0000 UTC m=+593.004944255" observedRunningTime="2026-02-17 15:30:31.788840237 +0000 UTC m=+593.319470658" watchObservedRunningTime="2026-02-17 15:30:31.799826447 +0000 UTC m=+593.330456898" Feb 17 15:30:34 crc kubenswrapper[4806]: I0217 15:30:34.784264 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:30:34 crc kubenswrapper[4806]: I0217 15:30:34.784326 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:30:39 crc kubenswrapper[4806]: I0217 15:30:39.521704 4806 scope.go:117] "RemoveContainer" containerID="73c291f28f9e5d9e90a3c8c2571e1f5d971573ae8724eb803be9a74737105f21" Feb 17 15:30:46 crc kubenswrapper[4806]: I0217 15:30:46.030543 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86df85fbff-5qgpr" Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.785293 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.786254 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.786341 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.787186 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.787265 4806 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162" gracePeriod=600 Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.993044 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162" exitCode=0 Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.993120 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162"} Feb 17 15:31:04 crc kubenswrapper[4806]: I0217 15:31:04.993551 4806 scope.go:117] "RemoveContainer" containerID="d3c5eb91e36e273fb32d5dc251978788cfa8103bc407e8fe5a600b96d3fdeb1b" Feb 17 15:31:05 crc kubenswrapper[4806]: I0217 15:31:05.640634 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5bc4556c9c-hnh6t" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.001232 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395"} Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.301461 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-m5dz4"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.304021 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.306954 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.307370 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.307706 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.307903 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-j5k2t" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.308041 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.314386 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.320400 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.408873 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-startup\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.408969 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics\") pod \"frr-k8s-m5dz4\" (UID: 
\"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409012 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-conf\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409038 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksp5q\" (UniqueName: \"kubernetes.io/projected/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-kube-api-access-ksp5q\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409086 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-sockets\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409113 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-reloader\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409135 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmx68\" (UniqueName: \"kubernetes.io/projected/ae60a61d-5eab-4c36-8cbd-412a743c2c87-kube-api-access-lmx68\") pod \"frr-k8s-m5dz4\" (UID: 
\"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409159 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.409208 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics-certs\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.410342 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-84xmf"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.411475 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.414549 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.417470 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.426153 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.426185 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jxsd9" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.435836 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-fbsrm"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.436791 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.441269 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.458916 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-fbsrm"] Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513052 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-cert\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513135 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8gx\" (UniqueName: \"kubernetes.io/projected/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-kube-api-access-km8gx\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513168 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-sockets\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513192 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513217 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-reloader\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513235 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmx68\" (UniqueName: \"kubernetes.io/projected/ae60a61d-5eab-4c36-8cbd-412a743c2c87-kube-api-access-lmx68\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513262 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513292 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics-certs\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513317 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-startup\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513340 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513371 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpnsb\" (UniqueName: \"kubernetes.io/projected/a673b557-0484-42f5-b6ac-211f25330796-kube-api-access-tpnsb\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513418 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513444 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metallb-excludel2\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513468 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513496 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksp5q\" (UniqueName: 
\"kubernetes.io/projected/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-kube-api-access-ksp5q\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.513517 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-conf\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.514099 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-conf\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.514205 4806 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.514260 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert podName:15bfb41c-c1ca-453a-ad2b-04e10d1ce059 nodeName:}" failed. No retries permitted until 2026-02-17 15:31:07.01423654 +0000 UTC m=+628.544867041 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert") pod "frr-k8s-webhook-server-78b44bf5bb-vvpxh" (UID: "15bfb41c-c1ca-453a-ad2b-04e10d1ce059") : secret "frr-k8s-webhook-server-cert" not found Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.516573 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-sockets\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.516795 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-reloader\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.517465 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ae60a61d-5eab-4c36-8cbd-412a743c2c87-frr-startup\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.519707 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.537948 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae60a61d-5eab-4c36-8cbd-412a743c2c87-metrics-certs\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " 
pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.540151 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmx68\" (UniqueName: \"kubernetes.io/projected/ae60a61d-5eab-4c36-8cbd-412a743c2c87-kube-api-access-lmx68\") pod \"frr-k8s-m5dz4\" (UID: \"ae60a61d-5eab-4c36-8cbd-412a743c2c87\") " pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.544168 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksp5q\" (UniqueName: \"kubernetes.io/projected/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-kube-api-access-ksp5q\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615337 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615727 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpnsb\" (UniqueName: \"kubernetes.io/projected/a673b557-0484-42f5-b6ac-211f25330796-kube-api-access-tpnsb\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.615485 4806 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615757 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metallb-excludel2\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.615832 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs podName:a673b557-0484-42f5-b6ac-211f25330796 nodeName:}" failed. No retries permitted until 2026-02-17 15:31:07.115798291 +0000 UTC m=+628.646428752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs") pod "controller-69bbfbf88f-fbsrm" (UID: "a673b557-0484-42f5-b6ac-211f25330796") : secret "controller-certs-secret" not found Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615859 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615964 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-cert\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.615982 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8gx\" (UniqueName: \"kubernetes.io/projected/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-kube-api-access-km8gx\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 
15:31:06.616005 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.616122 4806 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.616207 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs podName:6191e3ef-dd3f-4905-aa2b-282df9ef96b8 nodeName:}" failed. No retries permitted until 2026-02-17 15:31:07.11618545 +0000 UTC m=+628.646815911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs") pod "speaker-84xmf" (UID: "6191e3ef-dd3f-4905-aa2b-282df9ef96b8") : secret "speaker-certs-secret" not found Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.616143 4806 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 15:31:06 crc kubenswrapper[4806]: E0217 15:31:06.616259 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist podName:6191e3ef-dd3f-4905-aa2b-282df9ef96b8 nodeName:}" failed. No retries permitted until 2026-02-17 15:31:07.116248242 +0000 UTC m=+628.646878663 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist") pod "speaker-84xmf" (UID: "6191e3ef-dd3f-4905-aa2b-282df9ef96b8") : secret "metallb-memberlist" not found Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.616387 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metallb-excludel2\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.617654 4806 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.625374 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.632425 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-cert\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.639570 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8gx\" (UniqueName: \"kubernetes.io/projected/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-kube-api-access-km8gx\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:06 crc kubenswrapper[4806]: I0217 15:31:06.643042 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpnsb\" (UniqueName: \"kubernetes.io/projected/a673b557-0484-42f5-b6ac-211f25330796-kube-api-access-tpnsb\") pod \"controller-69bbfbf88f-fbsrm\" (UID: 
\"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.007523 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"82ad5a6d0246710395dda16bcc92d8e2c47e5ed2b371b81e098cbbdf5e22ecf3"} Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.020512 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.025563 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15bfb41c-c1ca-453a-ad2b-04e10d1ce059-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-vvpxh\" (UID: \"15bfb41c-c1ca-453a-ad2b-04e10d1ce059\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.121300 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.121858 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.121926 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:07 crc kubenswrapper[4806]: E0217 15:31:07.121716 4806 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 15:31:07 crc kubenswrapper[4806]: E0217 15:31:07.122340 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist podName:6191e3ef-dd3f-4905-aa2b-282df9ef96b8 nodeName:}" failed. No retries permitted until 2026-02-17 15:31:08.122315143 +0000 UTC m=+629.652945564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist") pod "speaker-84xmf" (UID: "6191e3ef-dd3f-4905-aa2b-282df9ef96b8") : secret "metallb-memberlist" not found Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.125939 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-metrics-certs\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.127012 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a673b557-0484-42f5-b6ac-211f25330796-metrics-certs\") pod \"controller-69bbfbf88f-fbsrm\" (UID: \"a673b557-0484-42f5-b6ac-211f25330796\") " pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.236365 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.355151 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.541924 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-fbsrm"] Feb 17 15:31:07 crc kubenswrapper[4806]: I0217 15:31:07.693017 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh"] Feb 17 15:31:07 crc kubenswrapper[4806]: W0217 15:31:07.696502 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15bfb41c_c1ca_453a_ad2b_04e10d1ce059.slice/crio-0b4d86ac23dd83e9cdc42864d4b5a2867350bd6bb9e09ac4cc29617cd0ad8e96 WatchSource:0}: Error finding container 0b4d86ac23dd83e9cdc42864d4b5a2867350bd6bb9e09ac4cc29617cd0ad8e96: Status 404 returned error can't find the container with id 0b4d86ac23dd83e9cdc42864d4b5a2867350bd6bb9e09ac4cc29617cd0ad8e96 Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.015310 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fbsrm" event={"ID":"a673b557-0484-42f5-b6ac-211f25330796","Type":"ContainerStarted","Data":"d115c6482c074fb9d5c90e8e68bc5a907f680ae69460d33fe92b76477a80e0f4"} Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.015379 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fbsrm" event={"ID":"a673b557-0484-42f5-b6ac-211f25330796","Type":"ContainerStarted","Data":"4d109a3bb86b6dbd02c9f9b3d66d353af23223042461828187db8e0c7f0e0c97"} Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.016793 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" 
event={"ID":"15bfb41c-c1ca-453a-ad2b-04e10d1ce059","Type":"ContainerStarted","Data":"0b4d86ac23dd83e9cdc42864d4b5a2867350bd6bb9e09ac4cc29617cd0ad8e96"} Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.137900 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.145267 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6191e3ef-dd3f-4905-aa2b-282df9ef96b8-memberlist\") pod \"speaker-84xmf\" (UID: \"6191e3ef-dd3f-4905-aa2b-282df9ef96b8\") " pod="metallb-system/speaker-84xmf" Feb 17 15:31:08 crc kubenswrapper[4806]: I0217 15:31:08.227056 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-84xmf" Feb 17 15:31:08 crc kubenswrapper[4806]: W0217 15:31:08.248693 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6191e3ef_dd3f_4905_aa2b_282df9ef96b8.slice/crio-9939082a00a69d21dd8a902c855a1d1f89e4bf69e240c047b71fc8e664358f94 WatchSource:0}: Error finding container 9939082a00a69d21dd8a902c855a1d1f89e4bf69e240c047b71fc8e664358f94: Status 404 returned error can't find the container with id 9939082a00a69d21dd8a902c855a1d1f89e4bf69e240c047b71fc8e664358f94 Feb 17 15:31:09 crc kubenswrapper[4806]: I0217 15:31:09.030141 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-84xmf" event={"ID":"6191e3ef-dd3f-4905-aa2b-282df9ef96b8","Type":"ContainerStarted","Data":"ab775d384b008a76ce0e324896ccd3371bf9d281981e6b5c5a077d9dbb3c3b94"} Feb 17 15:31:09 crc kubenswrapper[4806]: I0217 15:31:09.030638 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-84xmf" event={"ID":"6191e3ef-dd3f-4905-aa2b-282df9ef96b8","Type":"ContainerStarted","Data":"9939082a00a69d21dd8a902c855a1d1f89e4bf69e240c047b71fc8e664358f94"} Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.073876 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" event={"ID":"15bfb41c-c1ca-453a-ad2b-04e10d1ce059","Type":"ContainerStarted","Data":"7aa0f94b8a05817dc62ace84e62e9354b17a196d0202d6064c2651c8cf1b0347"} Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.074311 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.075923 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-84xmf" event={"ID":"6191e3ef-dd3f-4905-aa2b-282df9ef96b8","Type":"ContainerStarted","Data":"8ad552c3d38b539e48d39ba53b1adb3d3100f07509d9d4d2e77001e57c6767c1"} Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.076002 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-84xmf" Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.077782 4806 generic.go:334] "Generic (PLEG): container finished" podID="ae60a61d-5eab-4c36-8cbd-412a743c2c87" containerID="e42399c6a265d876d6a16b86569d189f2ecf7160070a0ac1053aa9afae6ff958" exitCode=0 Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.077840 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerDied","Data":"e42399c6a265d876d6a16b86569d189f2ecf7160070a0ac1053aa9afae6ff958"} Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.080927 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-fbsrm" 
event={"ID":"a673b557-0484-42f5-b6ac-211f25330796","Type":"ContainerStarted","Data":"399fcddcfe98c941c078ef6d9e9f4be27dbced0c057b6e6d80d02e64938999ee"} Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.081175 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.124152 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" podStartSLOduration=2.096438133 podStartE2EDuration="8.124130767s" podCreationTimestamp="2026-02-17 15:31:06 +0000 UTC" firstStartedPulling="2026-02-17 15:31:07.699700663 +0000 UTC m=+629.230331074" lastFinishedPulling="2026-02-17 15:31:13.727393297 +0000 UTC m=+635.258023708" observedRunningTime="2026-02-17 15:31:14.114969332 +0000 UTC m=+635.645599783" watchObservedRunningTime="2026-02-17 15:31:14.124130767 +0000 UTC m=+635.654761188" Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.187252 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-84xmf" podStartSLOduration=3.001573821 podStartE2EDuration="8.187234905s" podCreationTimestamp="2026-02-17 15:31:06 +0000 UTC" firstStartedPulling="2026-02-17 15:31:08.499829136 +0000 UTC m=+630.030459547" lastFinishedPulling="2026-02-17 15:31:13.68549022 +0000 UTC m=+635.216120631" observedRunningTime="2026-02-17 15:31:14.183499743 +0000 UTC m=+635.714130164" watchObservedRunningTime="2026-02-17 15:31:14.187234905 +0000 UTC m=+635.717865316" Feb 17 15:31:14 crc kubenswrapper[4806]: I0217 15:31:14.208979 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-fbsrm" podStartSLOduration=2.25986896 podStartE2EDuration="8.208956577s" podCreationTimestamp="2026-02-17 15:31:06 +0000 UTC" firstStartedPulling="2026-02-17 15:31:07.734732872 +0000 UTC m=+629.265363283" lastFinishedPulling="2026-02-17 
15:31:13.683820489 +0000 UTC m=+635.214450900" observedRunningTime="2026-02-17 15:31:14.206834775 +0000 UTC m=+635.737465196" watchObservedRunningTime="2026-02-17 15:31:14.208956577 +0000 UTC m=+635.739586988" Feb 17 15:31:15 crc kubenswrapper[4806]: I0217 15:31:15.090149 4806 generic.go:334] "Generic (PLEG): container finished" podID="ae60a61d-5eab-4c36-8cbd-412a743c2c87" containerID="d9a169d2c08c674c62fc60d48ea5ed5cb384520f20aa493e0ea2ae85fff51ed3" exitCode=0 Feb 17 15:31:15 crc kubenswrapper[4806]: I0217 15:31:15.090303 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerDied","Data":"d9a169d2c08c674c62fc60d48ea5ed5cb384520f20aa493e0ea2ae85fff51ed3"} Feb 17 15:31:16 crc kubenswrapper[4806]: I0217 15:31:16.098286 4806 generic.go:334] "Generic (PLEG): container finished" podID="ae60a61d-5eab-4c36-8cbd-412a743c2c87" containerID="a668081dfc257a7d9b2417b72af707a87964288da27fb63910cefb6ea52c242c" exitCode=0 Feb 17 15:31:16 crc kubenswrapper[4806]: I0217 15:31:16.098356 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerDied","Data":"a668081dfc257a7d9b2417b72af707a87964288da27fb63910cefb6ea52c242c"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108550 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"833afa886fac600b3de6bd5629bd19960e0b659df47b3f85e4825cff11845230"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108927 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108938 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" 
event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"3965ff07b76497be288e796d7940b666202abbcc28db970811195cf1453a4dfb"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108947 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"24a9a801567a8e7de36aca2692a8306dc936417683df47a76745278fe3d572c4"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108958 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"4dc13e0e8e70f303a96859725868ccc9a1191a1e97fae1206b86e57dc12efebb"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108967 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"3d293707606e0dc5d54072618524a5f9fa7458653338545672d1e2a279dcc6cb"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.108974 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-m5dz4" event={"ID":"ae60a61d-5eab-4c36-8cbd-412a743c2c87","Type":"ContainerStarted","Data":"6f7f275d67dc3a57ae260c6e652e0c686b7b628930f4badf1f4c503838033d4a"} Feb 17 15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.138194 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-m5dz4" podStartSLOduration=4.205598588 podStartE2EDuration="11.138171084s" podCreationTimestamp="2026-02-17 15:31:06 +0000 UTC" firstStartedPulling="2026-02-17 15:31:06.751932529 +0000 UTC m=+628.282562940" lastFinishedPulling="2026-02-17 15:31:13.684505015 +0000 UTC m=+635.215135436" observedRunningTime="2026-02-17 15:31:17.131224024 +0000 UTC m=+638.661854475" watchObservedRunningTime="2026-02-17 15:31:17.138171084 +0000 UTC m=+638.668801505" Feb 17 
15:31:17 crc kubenswrapper[4806]: I0217 15:31:17.361987 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-fbsrm" Feb 17 15:31:18 crc kubenswrapper[4806]: I0217 15:31:18.231241 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-84xmf" Feb 17 15:31:21 crc kubenswrapper[4806]: I0217 15:31:21.626933 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:21 crc kubenswrapper[4806]: I0217 15:31:21.667737 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.002905 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.004781 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.009589 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.010194 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-4l5kb" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.010282 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.031022 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.054896 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmcn5\" (UniqueName: 
\"kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5\") pod \"mariadb-operator-index-s2b9s\" (UID: \"3c46c85c-951a-4876-a6b4-3497994ea8d0\") " pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.155704 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmcn5\" (UniqueName: \"kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5\") pod \"mariadb-operator-index-s2b9s\" (UID: \"3c46c85c-951a-4876-a6b4-3497994ea8d0\") " pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.183537 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmcn5\" (UniqueName: \"kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5\") pod \"mariadb-operator-index-s2b9s\" (UID: \"3c46c85c-951a-4876-a6b4-3497994ea8d0\") " pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.342972 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:24 crc kubenswrapper[4806]: I0217 15:31:24.611012 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:25 crc kubenswrapper[4806]: I0217 15:31:25.173500 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-s2b9s" event={"ID":"3c46c85c-951a-4876-a6b4-3497994ea8d0","Type":"ContainerStarted","Data":"1f15b980549841e8dab35c7092a80f8444d23530f41a7f2c62a8dab80c346bcf"} Feb 17 15:31:26 crc kubenswrapper[4806]: I0217 15:31:26.632811 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-m5dz4" Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.178675 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-s2b9s" event={"ID":"3c46c85c-951a-4876-a6b4-3497994ea8d0","Type":"ContainerStarted","Data":"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13"} Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.198476 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-s2b9s" podStartSLOduration=2.3413552 podStartE2EDuration="4.198444204s" podCreationTimestamp="2026-02-17 15:31:23 +0000 UTC" firstStartedPulling="2026-02-17 15:31:24.621246151 +0000 UTC m=+646.151876582" lastFinishedPulling="2026-02-17 15:31:26.478335165 +0000 UTC m=+648.008965586" observedRunningTime="2026-02-17 15:31:27.195094351 +0000 UTC m=+648.725724782" watchObservedRunningTime="2026-02-17 15:31:27.198444204 +0000 UTC m=+648.729074655" Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.241002 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-vvpxh" Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.361950 4806 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.968058 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-jdh9x"] Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.969855 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:27 crc kubenswrapper[4806]: I0217 15:31:27.984274 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-jdh9x"] Feb 17 15:31:28 crc kubenswrapper[4806]: I0217 15:31:28.113266 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtl56\" (UniqueName: \"kubernetes.io/projected/46b07016-998e-4215-81bb-b2c71a8ccd82-kube-api-access-xtl56\") pod \"mariadb-operator-index-jdh9x\" (UID: \"46b07016-998e-4215-81bb-b2c71a8ccd82\") " pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:28 crc kubenswrapper[4806]: I0217 15:31:28.214664 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtl56\" (UniqueName: \"kubernetes.io/projected/46b07016-998e-4215-81bb-b2c71a8ccd82-kube-api-access-xtl56\") pod \"mariadb-operator-index-jdh9x\" (UID: \"46b07016-998e-4215-81bb-b2c71a8ccd82\") " pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:28 crc kubenswrapper[4806]: I0217 15:31:28.266508 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtl56\" (UniqueName: \"kubernetes.io/projected/46b07016-998e-4215-81bb-b2c71a8ccd82-kube-api-access-xtl56\") pod \"mariadb-operator-index-jdh9x\" (UID: \"46b07016-998e-4215-81bb-b2c71a8ccd82\") " pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:28 crc kubenswrapper[4806]: I0217 15:31:28.304326 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:28 crc kubenswrapper[4806]: I0217 15:31:28.494762 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-jdh9x"] Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.195316 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jdh9x" event={"ID":"46b07016-998e-4215-81bb-b2c71a8ccd82","Type":"ContainerStarted","Data":"7069ac131d7e487a83e74cff673835ec1ed111e5670db26569ca62431e2e09b4"} Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.195382 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-jdh9x" event={"ID":"46b07016-998e-4215-81bb-b2c71a8ccd82","Type":"ContainerStarted","Data":"1e98ca1822d4eed759e6c63da1089a948b2269907ada95f794ffe646a1911fb0"} Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.197559 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-s2b9s" podUID="3c46c85c-951a-4876-a6b4-3497994ea8d0" containerName="registry-server" containerID="cri-o://fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13" gracePeriod=2 Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.216802 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-jdh9x" podStartSLOduration=1.800803591 podStartE2EDuration="2.216780432s" podCreationTimestamp="2026-02-17 15:31:27 +0000 UTC" firstStartedPulling="2026-02-17 15:31:28.508204265 +0000 UTC m=+650.038834676" lastFinishedPulling="2026-02-17 15:31:28.924181106 +0000 UTC m=+650.454811517" observedRunningTime="2026-02-17 15:31:29.214126457 +0000 UTC m=+650.744756908" watchObservedRunningTime="2026-02-17 15:31:29.216780432 +0000 UTC m=+650.747410863" Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.628580 4806 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.734383 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmcn5\" (UniqueName: \"kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5\") pod \"3c46c85c-951a-4876-a6b4-3497994ea8d0\" (UID: \"3c46c85c-951a-4876-a6b4-3497994ea8d0\") " Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.741687 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5" (OuterVolumeSpecName: "kube-api-access-jmcn5") pod "3c46c85c-951a-4876-a6b4-3497994ea8d0" (UID: "3c46c85c-951a-4876-a6b4-3497994ea8d0"). InnerVolumeSpecName "kube-api-access-jmcn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:31:29 crc kubenswrapper[4806]: I0217 15:31:29.835571 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmcn5\" (UniqueName: \"kubernetes.io/projected/3c46c85c-951a-4876-a6b4-3497994ea8d0-kube-api-access-jmcn5\") on node \"crc\" DevicePath \"\"" Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.201724 4806 generic.go:334] "Generic (PLEG): container finished" podID="3c46c85c-951a-4876-a6b4-3497994ea8d0" containerID="fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13" exitCode=0 Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.201799 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-s2b9s" Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.201797 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-s2b9s" event={"ID":"3c46c85c-951a-4876-a6b4-3497994ea8d0","Type":"ContainerDied","Data":"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13"} Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.201864 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-s2b9s" event={"ID":"3c46c85c-951a-4876-a6b4-3497994ea8d0","Type":"ContainerDied","Data":"1f15b980549841e8dab35c7092a80f8444d23530f41a7f2c62a8dab80c346bcf"} Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.201889 4806 scope.go:117] "RemoveContainer" containerID="fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13" Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.238576 4806 scope.go:117] "RemoveContainer" containerID="fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13" Feb 17 15:31:30 crc kubenswrapper[4806]: E0217 15:31:30.239181 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13\": container with ID starting with fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13 not found: ID does not exist" containerID="fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13" Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.239342 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13"} err="failed to get container status \"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13\": rpc error: code = NotFound desc = could not find container 
\"fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13\": container with ID starting with fe83ccb264350b6f7bb3922648180e4a3b370d99e22f72527766f3ae6124ab13 not found: ID does not exist" Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.246364 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:30 crc kubenswrapper[4806]: I0217 15:31:30.252814 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-s2b9s"] Feb 17 15:31:31 crc kubenswrapper[4806]: I0217 15:31:31.175889 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c46c85c-951a-4876-a6b4-3497994ea8d0" path="/var/lib/kubelet/pods/3c46c85c-951a-4876-a6b4-3497994ea8d0/volumes" Feb 17 15:31:38 crc kubenswrapper[4806]: I0217 15:31:38.304870 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:38 crc kubenswrapper[4806]: I0217 15:31:38.305460 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:38 crc kubenswrapper[4806]: I0217 15:31:38.338637 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:39 crc kubenswrapper[4806]: I0217 15:31:39.318445 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-jdh9x" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.034660 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj"] Feb 17 15:31:45 crc kubenswrapper[4806]: E0217 15:31:45.035310 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c46c85c-951a-4876-a6b4-3497994ea8d0" containerName="registry-server" Feb 17 15:31:45 crc 
kubenswrapper[4806]: I0217 15:31:45.035323 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c46c85c-951a-4876-a6b4-3497994ea8d0" containerName="registry-server" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.035442 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c46c85c-951a-4876-a6b4-3497994ea8d0" containerName="registry-server" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.036228 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.039044 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.050011 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj"] Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.151340 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.151640 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6crw\" (UniqueName: \"kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc 
kubenswrapper[4806]: I0217 15:31:45.151745 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.253033 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6crw\" (UniqueName: \"kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.253088 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.253553 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.253795 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.254685 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.282644 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6crw\" (UniqueName: \"kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw\") pod \"4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.359477 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:45 crc kubenswrapper[4806]: I0217 15:31:45.855479 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj"] Feb 17 15:31:46 crc kubenswrapper[4806]: I0217 15:31:46.353711 4806 generic.go:334] "Generic (PLEG): container finished" podID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerID="33c1c0cd41a2cfb65ac2ef58fb4c3e497d7664db488c62c0a69ce550389641b7" exitCode=0 Feb 17 15:31:46 crc kubenswrapper[4806]: I0217 15:31:46.353761 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerDied","Data":"33c1c0cd41a2cfb65ac2ef58fb4c3e497d7664db488c62c0a69ce550389641b7"} Feb 17 15:31:46 crc kubenswrapper[4806]: I0217 15:31:46.353799 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerStarted","Data":"d46e2f41023e98fcb0b856f8f8aabfe39393b7082576adcd17a044cd60e362d0"} Feb 17 15:31:47 crc kubenswrapper[4806]: I0217 15:31:47.374368 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerStarted","Data":"9a8628b6c239ab793c39676926e7764d8fff19694e08ed2b0826b5a835e9e07c"} Feb 17 15:31:48 crc kubenswrapper[4806]: I0217 15:31:48.383311 4806 generic.go:334] "Generic (PLEG): container finished" podID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerID="9a8628b6c239ab793c39676926e7764d8fff19694e08ed2b0826b5a835e9e07c" exitCode=0 Feb 17 15:31:48 crc kubenswrapper[4806]: I0217 15:31:48.383353 4806 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerDied","Data":"9a8628b6c239ab793c39676926e7764d8fff19694e08ed2b0826b5a835e9e07c"} Feb 17 15:31:49 crc kubenswrapper[4806]: I0217 15:31:49.394318 4806 generic.go:334] "Generic (PLEG): container finished" podID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerID="5fcdafd0adfe56880cbc5f83b2828f74b120bd1d3155589a7e7f2f43ca617a1c" exitCode=0 Feb 17 15:31:49 crc kubenswrapper[4806]: I0217 15:31:49.394367 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerDied","Data":"5fcdafd0adfe56880cbc5f83b2828f74b120bd1d3155589a7e7f2f43ca617a1c"} Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.728077 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.833725 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6crw\" (UniqueName: \"kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw\") pod \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.833814 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle\") pod \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.833849 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util\") pod \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\" (UID: \"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322\") " Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.835498 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle" (OuterVolumeSpecName: "bundle") pod "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" (UID: "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.840347 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw" (OuterVolumeSpecName: "kube-api-access-j6crw") pod "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" (UID: "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322"). InnerVolumeSpecName "kube-api-access-j6crw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.849850 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util" (OuterVolumeSpecName: "util") pod "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" (UID: "e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.935063 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6crw\" (UniqueName: \"kubernetes.io/projected/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-kube-api-access-j6crw\") on node \"crc\" DevicePath \"\"" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.935098 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:31:50 crc kubenswrapper[4806]: I0217 15:31:50.935109 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:31:51 crc kubenswrapper[4806]: I0217 15:31:51.411577 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" event={"ID":"e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322","Type":"ContainerDied","Data":"d46e2f41023e98fcb0b856f8f8aabfe39393b7082576adcd17a044cd60e362d0"} Feb 17 15:31:51 crc kubenswrapper[4806]: I0217 15:31:51.411643 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d46e2f41023e98fcb0b856f8f8aabfe39393b7082576adcd17a044cd60e362d0" Feb 17 15:31:51 crc kubenswrapper[4806]: I0217 15:31:51.411649 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.352249 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4"] Feb 17 15:31:58 crc kubenswrapper[4806]: E0217 15:31:58.354094 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="extract" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.354165 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="extract" Feb 17 15:31:58 crc kubenswrapper[4806]: E0217 15:31:58.354224 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="pull" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.354274 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="pull" Feb 17 15:31:58 crc kubenswrapper[4806]: E0217 15:31:58.354333 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="util" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.354387 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="util" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.354556 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322" containerName="extract" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.354978 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.356757 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.358008 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.358060 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c6vzj" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.369336 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4"] Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.543130 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-webhook-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.543250 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6hnn\" (UniqueName: \"kubernetes.io/projected/b9f35401-32a2-47fd-b1d3-688724190542-kube-api-access-h6hnn\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.543378 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-apiservice-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.644282 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-webhook-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.644756 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6hnn\" (UniqueName: \"kubernetes.io/projected/b9f35401-32a2-47fd-b1d3-688724190542-kube-api-access-h6hnn\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.644964 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-apiservice-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.656031 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-webhook-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " 
pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.656053 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9f35401-32a2-47fd-b1d3-688724190542-apiservice-cert\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.672982 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6hnn\" (UniqueName: \"kubernetes.io/projected/b9f35401-32a2-47fd-b1d3-688724190542-kube-api-access-h6hnn\") pod \"mariadb-operator-controller-manager-6cd577c68-qswc4\" (UID: \"b9f35401-32a2-47fd-b1d3-688724190542\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:58 crc kubenswrapper[4806]: I0217 15:31:58.678545 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:31:59 crc kubenswrapper[4806]: I0217 15:31:59.210084 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4"] Feb 17 15:31:59 crc kubenswrapper[4806]: W0217 15:31:59.217827 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9f35401_32a2_47fd_b1d3_688724190542.slice/crio-d64c1515b44ab756140a39031a2d70d150f24dc864b293e16eea96c779811485 WatchSource:0}: Error finding container d64c1515b44ab756140a39031a2d70d150f24dc864b293e16eea96c779811485: Status 404 returned error can't find the container with id d64c1515b44ab756140a39031a2d70d150f24dc864b293e16eea96c779811485 Feb 17 15:31:59 crc kubenswrapper[4806]: I0217 15:31:59.461636 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" event={"ID":"b9f35401-32a2-47fd-b1d3-688724190542","Type":"ContainerStarted","Data":"d64c1515b44ab756140a39031a2d70d150f24dc864b293e16eea96c779811485"} Feb 17 15:32:02 crc kubenswrapper[4806]: I0217 15:32:02.479899 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" event={"ID":"b9f35401-32a2-47fd-b1d3-688724190542","Type":"ContainerStarted","Data":"ab1f7ff2da2f8c1d77fad8a354d39078a31d179ab2196099e199ad55f29cbaf8"} Feb 17 15:32:02 crc kubenswrapper[4806]: I0217 15:32:02.481955 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" Feb 17 15:32:02 crc kubenswrapper[4806]: I0217 15:32:02.505289 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4" podStartSLOduration=1.488335116 
podStartE2EDuration="4.505260772s" podCreationTimestamp="2026-02-17 15:31:58 +0000 UTC" firstStartedPulling="2026-02-17 15:31:59.223043038 +0000 UTC m=+680.753673449" lastFinishedPulling="2026-02-17 15:32:02.239968684 +0000 UTC m=+683.770599105" observedRunningTime="2026-02-17 15:32:02.50437044 +0000 UTC m=+684.035000871" watchObservedRunningTime="2026-02-17 15:32:02.505260772 +0000 UTC m=+684.035891223"
Feb 17 15:32:08 crc kubenswrapper[4806]: I0217 15:32:08.687885 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd577c68-qswc4"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.315173 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-hj996"]
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.317121 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.319128 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-24ls8"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.349778 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-hj996"]
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.493828 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfvg\" (UniqueName: \"kubernetes.io/projected/dc89bc49-330e-48df-8030-51ee628cb608-kube-api-access-xcfvg\") pod \"infra-operator-index-hj996\" (UID: \"dc89bc49-330e-48df-8030-51ee628cb608\") " pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.595305 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfvg\" (UniqueName: \"kubernetes.io/projected/dc89bc49-330e-48df-8030-51ee628cb608-kube-api-access-xcfvg\") pod \"infra-operator-index-hj996\" (UID: \"dc89bc49-330e-48df-8030-51ee628cb608\") " pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.634094 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfvg\" (UniqueName: \"kubernetes.io/projected/dc89bc49-330e-48df-8030-51ee628cb608-kube-api-access-xcfvg\") pod \"infra-operator-index-hj996\" (UID: \"dc89bc49-330e-48df-8030-51ee628cb608\") " pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:15 crc kubenswrapper[4806]: I0217 15:32:15.691908 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:16 crc kubenswrapper[4806]: I0217 15:32:16.147368 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-hj996"]
Feb 17 15:32:16 crc kubenswrapper[4806]: I0217 15:32:16.571258 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hj996" event={"ID":"dc89bc49-330e-48df-8030-51ee628cb608","Type":"ContainerStarted","Data":"62de3d5b886ba45931fae581a9a26d19b15434b936fe72b8dde763717d4549eb"}
Feb 17 15:32:18 crc kubenswrapper[4806]: I0217 15:32:18.590536 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-hj996" event={"ID":"dc89bc49-330e-48df-8030-51ee628cb608","Type":"ContainerStarted","Data":"a1cb2a54bc42622837b664c866fef5d1289333266a76324a03de28b5d02b9e09"}
Feb 17 15:32:18 crc kubenswrapper[4806]: I0217 15:32:18.620739 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-hj996" podStartSLOduration=1.557927426 podStartE2EDuration="3.620700066s" podCreationTimestamp="2026-02-17 15:32:15 +0000 UTC" firstStartedPulling="2026-02-17 15:32:16.157990456 +0000 UTC m=+697.688620867" lastFinishedPulling="2026-02-17 15:32:18.220763066 +0000 UTC m=+699.751393507" observedRunningTime="2026-02-17 15:32:18.614819022 +0000 UTC m=+700.145449523" watchObservedRunningTime="2026-02-17 15:32:18.620700066 +0000 UTC m=+700.151330517"
Feb 17 15:32:25 crc kubenswrapper[4806]: I0217 15:32:25.693276 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:25 crc kubenswrapper[4806]: I0217 15:32:25.694632 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:25 crc kubenswrapper[4806]: I0217 15:32:25.732796 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:26 crc kubenswrapper[4806]: I0217 15:32:26.678446 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-hj996"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.191846 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"]
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.193963 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.206258 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.254819 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"]
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.261964 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkhm\" (UniqueName: \"kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.262144 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.262191 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.363065 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.363135 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.363192 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkhm\" (UniqueName: \"kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.363694 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.363704 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.391607 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkhm\" (UniqueName: \"kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm\") pod \"97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") " pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:33 crc kubenswrapper[4806]: I0217 15:32:33.568247 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:34 crc kubenswrapper[4806]: I0217 15:32:34.004037 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"]
Feb 17 15:32:34 crc kubenswrapper[4806]: W0217 15:32:34.008847 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode94031f9_ac0c_4950_b703_2133541e2cf1.slice/crio-06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f WatchSource:0}: Error finding container 06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f: Status 404 returned error can't find the container with id 06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f
Feb 17 15:32:34 crc kubenswrapper[4806]: I0217 15:32:34.712394 4806 generic.go:334] "Generic (PLEG): container finished" podID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerID="2a36fe635762d737f057623a74c66f1b4384b6d4c4e40f97afeae1f3965f7fdd" exitCode=0
Feb 17 15:32:34 crc kubenswrapper[4806]: I0217 15:32:34.712627 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerDied","Data":"2a36fe635762d737f057623a74c66f1b4384b6d4c4e40f97afeae1f3965f7fdd"}
Feb 17 15:32:34 crc kubenswrapper[4806]: I0217 15:32:34.712877 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerStarted","Data":"06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f"}
Feb 17 15:32:35 crc kubenswrapper[4806]: I0217 15:32:35.724302 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerStarted","Data":"2ae80e3e81267f2c9ed68c8c0c6f3a7e2db382f8e68a1a7af6bdf3a27f178e80"}
Feb 17 15:32:36 crc kubenswrapper[4806]: I0217 15:32:36.735584 4806 generic.go:334] "Generic (PLEG): container finished" podID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerID="2ae80e3e81267f2c9ed68c8c0c6f3a7e2db382f8e68a1a7af6bdf3a27f178e80" exitCode=0
Feb 17 15:32:36 crc kubenswrapper[4806]: I0217 15:32:36.735630 4806 generic.go:334] "Generic (PLEG): container finished" podID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerID="0bb9d3bb499ac2b020d278c8c8778ea1cb04f874b613f9223052b9475b998580" exitCode=0
Feb 17 15:32:36 crc kubenswrapper[4806]: I0217 15:32:36.735693 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerDied","Data":"2ae80e3e81267f2c9ed68c8c0c6f3a7e2db382f8e68a1a7af6bdf3a27f178e80"}
Feb 17 15:32:36 crc kubenswrapper[4806]: I0217 15:32:36.735773 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerDied","Data":"0bb9d3bb499ac2b020d278c8c8778ea1cb04f874b613f9223052b9475b998580"}
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.118592 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.141498 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwkhm\" (UniqueName: \"kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm\") pod \"e94031f9-ac0c-4950-b703-2133541e2cf1\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") "
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.141716 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util\") pod \"e94031f9-ac0c-4950-b703-2133541e2cf1\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") "
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.141799 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle\") pod \"e94031f9-ac0c-4950-b703-2133541e2cf1\" (UID: \"e94031f9-ac0c-4950-b703-2133541e2cf1\") "
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.147905 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle" (OuterVolumeSpecName: "bundle") pod "e94031f9-ac0c-4950-b703-2133541e2cf1" (UID: "e94031f9-ac0c-4950-b703-2133541e2cf1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.160779 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm" (OuterVolumeSpecName: "kube-api-access-dwkhm") pod "e94031f9-ac0c-4950-b703-2133541e2cf1" (UID: "e94031f9-ac0c-4950-b703-2133541e2cf1"). InnerVolumeSpecName "kube-api-access-dwkhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.175140 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util" (OuterVolumeSpecName: "util") pod "e94031f9-ac0c-4950-b703-2133541e2cf1" (UID: "e94031f9-ac0c-4950-b703-2133541e2cf1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.243862 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-util\") on node \"crc\" DevicePath \"\""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.244283 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e94031f9-ac0c-4950-b703-2133541e2cf1-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.244365 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwkhm\" (UniqueName: \"kubernetes.io/projected/e94031f9-ac0c-4950-b703-2133541e2cf1-kube-api-access-dwkhm\") on node \"crc\" DevicePath \"\""
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.753143 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z" event={"ID":"e94031f9-ac0c-4950-b703-2133541e2cf1","Type":"ContainerDied","Data":"06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f"}
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.753629 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06edd0f3e25769d26ed041a0ca0ba9dc182ae61985bc0d5d8accb14a1d4d514f"
Feb 17 15:32:38 crc kubenswrapper[4806]: I0217 15:32:38.753232 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.403833 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"]
Feb 17 15:32:48 crc kubenswrapper[4806]: E0217 15:32:48.404652 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="pull"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.404665 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="pull"
Feb 17 15:32:48 crc kubenswrapper[4806]: E0217 15:32:48.404681 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="extract"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.404687 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="extract"
Feb 17 15:32:48 crc kubenswrapper[4806]: E0217 15:32:48.404695 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="util"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.404702 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="util"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.404794 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94031f9-ac0c-4950-b703-2133541e2cf1" containerName="extract"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.405180 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.407125 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-4sx5b"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.408480 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.421102 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"]
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.494098 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kgzb\" (UniqueName: \"kubernetes.io/projected/4156966d-8f2a-4e04-8484-779309f87ee9-kube-api-access-8kgzb\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.494144 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-webhook-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.494219 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-apiservice-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.595120 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-apiservice-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.595219 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kgzb\" (UniqueName: \"kubernetes.io/projected/4156966d-8f2a-4e04-8484-779309f87ee9-kube-api-access-8kgzb\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.595252 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-webhook-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.608520 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-webhook-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.611520 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4156966d-8f2a-4e04-8484-779309f87ee9-apiservice-cert\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.628518 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kgzb\" (UniqueName: \"kubernetes.io/projected/4156966d-8f2a-4e04-8484-779309f87ee9-kube-api-access-8kgzb\") pod \"infra-operator-controller-manager-8464bf4b7b-r2bsp\" (UID: \"4156966d-8f2a-4e04-8484-779309f87ee9\") " pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:48 crc kubenswrapper[4806]: I0217 15:32:48.726173 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:49 crc kubenswrapper[4806]: I0217 15:32:49.262166 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"]
Feb 17 15:32:49 crc kubenswrapper[4806]: I0217 15:32:49.827328 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp" event={"ID":"4156966d-8f2a-4e04-8484-779309f87ee9","Type":"ContainerStarted","Data":"5aafddb31ba450e922556c486b157e8cea9c443e02753265fc8168f7584d2fa6"}
Feb 17 15:32:51 crc kubenswrapper[4806]: I0217 15:32:51.841684 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp" event={"ID":"4156966d-8f2a-4e04-8484-779309f87ee9","Type":"ContainerStarted","Data":"50835066a65ea590e72cb34ef9d824a2fc56ab81627d768c8edb9eeb79ef6bb1"}
Feb 17 15:32:51 crc kubenswrapper[4806]: I0217 15:32:51.842102 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp"
Feb 17 15:32:51 crc kubenswrapper[4806]: I0217 15:32:51.864397 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp" podStartSLOduration=2.013265313 podStartE2EDuration="3.864374336s" podCreationTimestamp="2026-02-17 15:32:48 +0000 UTC" firstStartedPulling="2026-02-17 15:32:49.271057728 +0000 UTC m=+730.801688139" lastFinishedPulling="2026-02-17 15:32:51.122166751 +0000 UTC m=+732.652797162" observedRunningTime="2026-02-17 15:32:51.862874269 +0000 UTC m=+733.393504700" watchObservedRunningTime="2026-02-17 15:32:51.864374336 +0000 UTC m=+733.395004757"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.220715 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.223682 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.230085 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.230976 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.231325 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-m6x24"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.231751 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.231931 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.246133 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.247377 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.252383 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.253756 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.258671 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.264135 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.269473 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"]
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.373845 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.373903 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kolla-config\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.373942 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.373970 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.373999 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-generated\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374023 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-kolla-config\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374053 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374081 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-generated\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374110 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr4hw\" (UniqueName: \"kubernetes.io/projected/cec2c43b-a867-4933-ba16-78ec075c6671-kube-api-access-fr4hw\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374135 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-operator-scripts\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374211 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-default\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374297 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374334 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqqz\" (UniqueName: \"kubernetes.io/projected/64071ec4-119f-4213-9fc1-d7d9e665ca53-kube-api-access-zbqqz\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374394 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-kolla-config\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374486 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374509 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pts\" (UniqueName: \"kubernetes.io/projected/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kube-api-access-w9pts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374525 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-default\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.374540 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-default\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476078 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476154 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-generated\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476199 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr4hw\" (UniqueName: \"kubernetes.io/projected/cec2c43b-a867-4933-ba16-78ec075c6671-kube-api-access-fr4hw\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476235 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-operator-scripts\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476267 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-default\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.476848 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.477936 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-default\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.477899 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478052 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqqz\" (UniqueName: \"kubernetes.io/projected/64071ec4-119f-4213-9fc1-d7d9e665ca53-kube-api-access-zbqqz\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478121 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-kolla-config\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478170 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.477459 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-generated\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478208 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pts\" (UniqueName: \"kubernetes.io/projected/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kube-api-access-w9pts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478290 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f3d61fdf-8cbc-400a-ab38-7ee67a131849-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478340 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-default\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0"
Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478391 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-default\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") "
pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478493 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478551 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kolla-config\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478635 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478699 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478760 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-generated\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478781 4806 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478806 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-kolla-config\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.478977 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.479938 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-kolla-config\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.480069 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.480172 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-default\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.480679 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-kolla-config\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.480884 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-config-data-default\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.480893 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cec2c43b-a867-4933-ba16-78ec075c6671-operator-scripts\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.481306 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kolla-config\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.481325 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64071ec4-119f-4213-9fc1-d7d9e665ca53-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.482609 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64071ec4-119f-4213-9fc1-d7d9e665ca53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.503203 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.504692 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr4hw\" (UniqueName: \"kubernetes.io/projected/cec2c43b-a867-4933-ba16-78ec075c6671-kube-api-access-fr4hw\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.509344 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqqz\" (UniqueName: \"kubernetes.io/projected/64071ec4-119f-4213-9fc1-d7d9e665ca53-kube-api-access-zbqqz\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.512022 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"openstack-galera-1\" (UID: \"cec2c43b-a867-4933-ba16-78ec075c6671\") " pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.512071 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"64071ec4-119f-4213-9fc1-d7d9e665ca53\") " pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.516177 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pts\" (UniqueName: \"kubernetes.io/projected/f3d61fdf-8cbc-400a-ab38-7ee67a131849-kube-api-access-w9pts\") pod \"openstack-galera-2\" (UID: \"f3d61fdf-8cbc-400a-ab38-7ee67a131849\") " pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.550777 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.572607 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:32:54 crc kubenswrapper[4806]: I0217 15:32:54.583787 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.005892 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Feb 17 15:32:55 crc kubenswrapper[4806]: W0217 15:32:55.007817 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64071ec4_119f_4213_9fc1_d7d9e665ca53.slice/crio-7b7120cbf1542c6956d4834ade428587e2334a76b354fdac5cb7f3c534cdc724 WatchSource:0}: Error finding container 7b7120cbf1542c6956d4834ade428587e2334a76b354fdac5cb7f3c534cdc724: Status 404 returned error can't find the container with id 7b7120cbf1542c6956d4834ade428587e2334a76b354fdac5cb7f3c534cdc724 Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.111690 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.124843 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.867451 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f3d61fdf-8cbc-400a-ab38-7ee67a131849","Type":"ContainerStarted","Data":"d06c0986bb52758cfd1c785868039b7679e3d09e58902813a5770816fab9ecfa"} Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.871002 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"cec2c43b-a867-4933-ba16-78ec075c6671","Type":"ContainerStarted","Data":"935f9ae02e8c947fe5154cf360124a03e9c564a06393146a517d18da280ba533"} Feb 17 15:32:55 crc kubenswrapper[4806]: I0217 15:32:55.873971 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" 
event={"ID":"64071ec4-119f-4213-9fc1-d7d9e665ca53","Type":"ContainerStarted","Data":"7b7120cbf1542c6956d4834ade428587e2334a76b354fdac5cb7f3c534cdc724"} Feb 17 15:32:58 crc kubenswrapper[4806]: I0217 15:32:58.732081 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-8464bf4b7b-r2bsp" Feb 17 15:33:03 crc kubenswrapper[4806]: I0217 15:33:03.925693 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"cec2c43b-a867-4933-ba16-78ec075c6671","Type":"ContainerStarted","Data":"05bb60df1e289864e9fe12da21c8c2dac205f12e97d6fefc72f820f07883f931"} Feb 17 15:33:03 crc kubenswrapper[4806]: I0217 15:33:03.928241 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"64071ec4-119f-4213-9fc1-d7d9e665ca53","Type":"ContainerStarted","Data":"9a9a78fcefbf8fdb3cd37615a03c5d3e13fe9c1e806dacd2eddd9ee9b5cb9cb4"} Feb 17 15:33:03 crc kubenswrapper[4806]: I0217 15:33:03.929838 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f3d61fdf-8cbc-400a-ab38-7ee67a131849","Type":"ContainerStarted","Data":"b84a89242552836ba878f0511205c508fc146b4bc89a3d31a1a39ea3f12c6720"} Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.040310 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.041267 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.043794 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-xkn59" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.043862 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.051614 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.121970 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-kolla-config\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.122018 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-config-data\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.122039 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vjxk\" (UniqueName: \"kubernetes.io/projected/36964ae0-c931-4256-9a0d-55e56cf16b33-kube-api-access-4vjxk\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.223544 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-kolla-config\") pod 
\"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.223600 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-config-data\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.223628 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vjxk\" (UniqueName: \"kubernetes.io/projected/36964ae0-c931-4256-9a0d-55e56cf16b33-kube-api-access-4vjxk\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.224930 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-kolla-config\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.225545 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36964ae0-c931-4256-9a0d-55e56cf16b33-config-data\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.256858 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vjxk\" (UniqueName: \"kubernetes.io/projected/36964ae0-c931-4256-9a0d-55e56cf16b33-kube-api-access-4vjxk\") pod \"memcached-0\" (UID: \"36964ae0-c931-4256-9a0d-55e56cf16b33\") " pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.362432 4806 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.879741 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.936204 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"36964ae0-c931-4256-9a0d-55e56cf16b33","Type":"ContainerStarted","Data":"1549ca123c753831cb55aa7870f08f8fd9d56f1e493b070bb06c2897b21c260f"} Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.970151 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.971039 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.973036 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-6x8s5" Feb 17 15:33:04 crc kubenswrapper[4806]: I0217 15:33:04.976643 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.037523 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vz7v\" (UniqueName: \"kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v\") pod \"rabbitmq-cluster-operator-index-pxn5j\" (UID: \"31eb196b-7a2c-4681-b089-413d4d0d8c8d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.139053 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vz7v\" (UniqueName: 
\"kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v\") pod \"rabbitmq-cluster-operator-index-pxn5j\" (UID: \"31eb196b-7a2c-4681-b089-413d4d0d8c8d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.171482 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vz7v\" (UniqueName: \"kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v\") pod \"rabbitmq-cluster-operator-index-pxn5j\" (UID: \"31eb196b-7a2c-4681-b089-413d4d0d8c8d\") " pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.291771 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.804863 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:05 crc kubenswrapper[4806]: I0217 15:33:05.945515 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" event={"ID":"31eb196b-7a2c-4681-b089-413d4d0d8c8d","Type":"ContainerStarted","Data":"63a9c7893f5a6230fd42c55b2eae5f1e5ba56bf04eb3b265c2e07c1a3534c286"} Feb 17 15:33:07 crc kubenswrapper[4806]: I0217 15:33:07.960244 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"36964ae0-c931-4256-9a0d-55e56cf16b33","Type":"ContainerStarted","Data":"7ab5cfe76bb3cc23db89cf67a57e53b56f15d2ece79362e8672f192d7af3f342"} Feb 17 15:33:07 crc kubenswrapper[4806]: I0217 15:33:07.960997 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:07 crc kubenswrapper[4806]: I0217 15:33:07.977625 4806 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=1.338781366 podStartE2EDuration="3.977610374s" podCreationTimestamp="2026-02-17 15:33:04 +0000 UTC" firstStartedPulling="2026-02-17 15:33:04.895322028 +0000 UTC m=+746.425952459" lastFinishedPulling="2026-02-17 15:33:07.534151056 +0000 UTC m=+749.064781467" observedRunningTime="2026-02-17 15:33:07.977370868 +0000 UTC m=+749.508001299" watchObservedRunningTime="2026-02-17 15:33:07.977610374 +0000 UTC m=+749.508240775" Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.967188 4806 generic.go:334] "Generic (PLEG): container finished" podID="cec2c43b-a867-4933-ba16-78ec075c6671" containerID="05bb60df1e289864e9fe12da21c8c2dac205f12e97d6fefc72f820f07883f931" exitCode=0 Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.967240 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"cec2c43b-a867-4933-ba16-78ec075c6671","Type":"ContainerDied","Data":"05bb60df1e289864e9fe12da21c8c2dac205f12e97d6fefc72f820f07883f931"} Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.975839 4806 generic.go:334] "Generic (PLEG): container finished" podID="64071ec4-119f-4213-9fc1-d7d9e665ca53" containerID="9a9a78fcefbf8fdb3cd37615a03c5d3e13fe9c1e806dacd2eddd9ee9b5cb9cb4" exitCode=0 Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.975929 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"64071ec4-119f-4213-9fc1-d7d9e665ca53","Type":"ContainerDied","Data":"9a9a78fcefbf8fdb3cd37615a03c5d3e13fe9c1e806dacd2eddd9ee9b5cb9cb4"} Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.989006 4806 generic.go:334] "Generic (PLEG): container finished" podID="f3d61fdf-8cbc-400a-ab38-7ee67a131849" containerID="b84a89242552836ba878f0511205c508fc146b4bc89a3d31a1a39ea3f12c6720" exitCode=0 Feb 17 15:33:08 crc kubenswrapper[4806]: I0217 15:33:08.989850 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f3d61fdf-8cbc-400a-ab38-7ee67a131849","Type":"ContainerDied","Data":"b84a89242552836ba878f0511205c508fc146b4bc89a3d31a1a39ea3f12c6720"} Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.150206 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.762518 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5fnnd"] Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.763740 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.765018 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5fnnd"] Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.805838 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jq4x\" (UniqueName: \"kubernetes.io/projected/38e7696e-97ac-4b38-9cd2-2e5e902aeb43-kube-api-access-4jq4x\") pod \"rabbitmq-cluster-operator-index-5fnnd\" (UID: \"38e7696e-97ac-4b38-9cd2-2e5e902aeb43\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.907242 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jq4x\" (UniqueName: \"kubernetes.io/projected/38e7696e-97ac-4b38-9cd2-2e5e902aeb43-kube-api-access-4jq4x\") pod \"rabbitmq-cluster-operator-index-5fnnd\" (UID: \"38e7696e-97ac-4b38-9cd2-2e5e902aeb43\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.948254 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jq4x\" (UniqueName: 
\"kubernetes.io/projected/38e7696e-97ac-4b38-9cd2-2e5e902aeb43-kube-api-access-4jq4x\") pod \"rabbitmq-cluster-operator-index-5fnnd\" (UID: \"38e7696e-97ac-4b38-9cd2-2e5e902aeb43\") " pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.996102 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"cec2c43b-a867-4933-ba16-78ec075c6671","Type":"ContainerStarted","Data":"cbb0633693a1cb91b8e43f675832f33045d293497e2a63010d4c66e8d054d510"} Feb 17 15:33:09 crc kubenswrapper[4806]: I0217 15:33:09.999251 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"64071ec4-119f-4213-9fc1-d7d9e665ca53","Type":"ContainerStarted","Data":"df77c077710a1e71ceaac174de186cc1322e32e4e2e99fb96c9419bafbb674a7"} Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.003828 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"f3d61fdf-8cbc-400a-ab38-7ee67a131849","Type":"ContainerStarted","Data":"b243de942a4a17e1411216ef3fa39c3fc3a55b14f6403da91e257d24630e8903"} Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.016184 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=8.688638368 podStartE2EDuration="17.016164331s" podCreationTimestamp="2026-02-17 15:32:53 +0000 UTC" firstStartedPulling="2026-02-17 15:32:55.126928109 +0000 UTC m=+736.657558530" lastFinishedPulling="2026-02-17 15:33:03.454454082 +0000 UTC m=+744.985084493" observedRunningTime="2026-02-17 15:33:10.013639199 +0000 UTC m=+751.544269610" watchObservedRunningTime="2026-02-17 15:33:10.016164331 +0000 UTC m=+751.546794742" Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.040989 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" 
podStartSLOduration=8.670010224 podStartE2EDuration="17.040967176s" podCreationTimestamp="2026-02-17 15:32:53 +0000 UTC" firstStartedPulling="2026-02-17 15:32:55.134682738 +0000 UTC m=+736.665313149" lastFinishedPulling="2026-02-17 15:33:03.50563969 +0000 UTC m=+745.036270101" observedRunningTime="2026-02-17 15:33:10.030929331 +0000 UTC m=+751.561559752" watchObservedRunningTime="2026-02-17 15:33:10.040967176 +0000 UTC m=+751.571597597" Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.050161 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=8.657708393 podStartE2EDuration="17.050141849s" podCreationTimestamp="2026-02-17 15:32:53 +0000 UTC" firstStartedPulling="2026-02-17 15:32:55.010221212 +0000 UTC m=+736.540851623" lastFinishedPulling="2026-02-17 15:33:03.402654668 +0000 UTC m=+744.933285079" observedRunningTime="2026-02-17 15:33:10.045516136 +0000 UTC m=+751.576146557" watchObservedRunningTime="2026-02-17 15:33:10.050141849 +0000 UTC m=+751.580772260" Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.083762 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:10 crc kubenswrapper[4806]: I0217 15:33:10.532077 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-5fnnd"] Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.009896 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" event={"ID":"38e7696e-97ac-4b38-9cd2-2e5e902aeb43","Type":"ContainerStarted","Data":"83a15dfba480baf4ba26efbdb8a24fcc92f93c72ed96cc942ee654b534fc7b45"} Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.010709 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" event={"ID":"31eb196b-7a2c-4681-b089-413d4d0d8c8d","Type":"ContainerStarted","Data":"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c"} Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.010827 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" podUID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" containerName="registry-server" containerID="cri-o://feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c" gracePeriod=2 Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.031457 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" podStartSLOduration=2.770704603 podStartE2EDuration="7.031438645s" podCreationTimestamp="2026-02-17 15:33:04 +0000 UTC" firstStartedPulling="2026-02-17 15:33:05.815205387 +0000 UTC m=+747.345835798" lastFinishedPulling="2026-02-17 15:33:10.075939429 +0000 UTC m=+751.606569840" observedRunningTime="2026-02-17 15:33:11.027799376 +0000 UTC m=+752.558429787" watchObservedRunningTime="2026-02-17 15:33:11.031438645 +0000 UTC m=+752.562069076" Feb 17 15:33:11 crc 
kubenswrapper[4806]: I0217 15:33:11.459900 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:11 crc kubenswrapper[4806]: E0217 15:33:11.464162 4806 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=1297504516997507890, SKID=, AKID=46:26:62:08:33:70:E0:77:D2:73:20:6A:8D:C3:4A:83:C2:EC:2E:C9 failed: x509: certificate signed by unknown authority" Feb 17 15:33:11 crc kubenswrapper[4806]: E0217 15:33:11.481664 4806 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=1297504516997507890, SKID=, AKID=46:26:62:08:33:70:E0:77:D2:73:20:6A:8D:C3:4A:83:C2:EC:2E:C9 failed: x509: certificate signed by unknown authority" Feb 17 15:33:11 crc kubenswrapper[4806]: E0217 15:33:11.495262 4806 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=1297504516997507890, SKID=, AKID=46:26:62:08:33:70:E0:77:D2:73:20:6A:8D:C3:4A:83:C2:EC:2E:C9 failed: x509: certificate signed by unknown authority" Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.530013 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vz7v\" (UniqueName: \"kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v\") pod \"31eb196b-7a2c-4681-b089-413d4d0d8c8d\" (UID: \"31eb196b-7a2c-4681-b089-413d4d0d8c8d\") " Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.543690 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v" (OuterVolumeSpecName: "kube-api-access-5vz7v") pod "31eb196b-7a2c-4681-b089-413d4d0d8c8d" (UID: "31eb196b-7a2c-4681-b089-413d4d0d8c8d"). InnerVolumeSpecName "kube-api-access-5vz7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:33:11 crc kubenswrapper[4806]: I0217 15:33:11.631871 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vz7v\" (UniqueName: \"kubernetes.io/projected/31eb196b-7a2c-4681-b089-413d4d0d8c8d-kube-api-access-5vz7v\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.017373 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" event={"ID":"38e7696e-97ac-4b38-9cd2-2e5e902aeb43","Type":"ContainerStarted","Data":"5fcb3138727e8d7e659483c3a04fa838d205a5d580d1a22f939f8605036aff9a"} Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.020251 4806 generic.go:334] "Generic (PLEG): container finished" podID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" containerID="feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c" exitCode=0 Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.020501 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.020533 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" event={"ID":"31eb196b-7a2c-4681-b089-413d4d0d8c8d","Type":"ContainerDied","Data":"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c"} Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.020804 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-pxn5j" event={"ID":"31eb196b-7a2c-4681-b089-413d4d0d8c8d","Type":"ContainerDied","Data":"63a9c7893f5a6230fd42c55b2eae5f1e5ba56bf04eb3b265c2e07c1a3534c286"} Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.020835 4806 scope.go:117] "RemoveContainer" containerID="feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.045001 4806 scope.go:117] "RemoveContainer" containerID="feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c" Feb 17 15:33:12 crc kubenswrapper[4806]: E0217 15:33:12.045555 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c\": container with ID starting with feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c not found: ID does not exist" containerID="feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.045600 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c"} err="failed to get container status \"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c\": rpc error: code = NotFound desc = could not find container 
\"feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c\": container with ID starting with feb6aceaa27db2f45e7c7b498b59f82560df93c8b73c5d7f723ff166d73f450c not found: ID does not exist" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.055879 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" podStartSLOduration=2.41038239 podStartE2EDuration="3.055855904s" podCreationTimestamp="2026-02-17 15:33:09 +0000 UTC" firstStartedPulling="2026-02-17 15:33:10.534457633 +0000 UTC m=+752.065088044" lastFinishedPulling="2026-02-17 15:33:11.179931147 +0000 UTC m=+752.710561558" observedRunningTime="2026-02-17 15:33:12.035713763 +0000 UTC m=+753.566344174" watchObservedRunningTime="2026-02-17 15:33:12.055855904 +0000 UTC m=+753.586486315" Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.059592 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:12 crc kubenswrapper[4806]: I0217 15:33:12.064336 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-pxn5j"] Feb 17 15:33:13 crc kubenswrapper[4806]: I0217 15:33:13.169263 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" path="/var/lib/kubelet/pods/31eb196b-7a2c-4681-b089-413d4d0d8c8d/volumes" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.279910 4806 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.364175 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.551172 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.551223 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.582885 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.582931 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.585274 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:33:14 crc kubenswrapper[4806]: I0217 15:33:14.585844 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:33:20 crc kubenswrapper[4806]: I0217 15:33:20.083992 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:20 crc kubenswrapper[4806]: I0217 15:33:20.084534 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:20 crc kubenswrapper[4806]: I0217 15:33:20.118694 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:20 crc kubenswrapper[4806]: I0217 15:33:20.669046 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:33:20 crc kubenswrapper[4806]: I0217 15:33:20.741091 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Feb 17 15:33:21 crc kubenswrapper[4806]: I0217 15:33:21.108300 4806 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-5fnnd" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.209093 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-zfdgv"] Feb 17 15:33:23 crc kubenswrapper[4806]: E0217 15:33:23.209687 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" containerName="registry-server" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.209701 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" containerName="registry-server" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.209816 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eb196b-7a2c-4681-b089-413d4d0d8c8d" containerName="registry-server" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.210207 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.212604 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.221828 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-zfdgv"] Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.282110 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.282750 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcbb\" (UniqueName: \"kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.384468 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxcbb\" (UniqueName: \"kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.384584 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.386050 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.407916 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxcbb\" (UniqueName: \"kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb\") pod \"root-account-create-update-zfdgv\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 
15:33:23 crc kubenswrapper[4806]: I0217 15:33:23.531720 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:24 crc kubenswrapper[4806]: I0217 15:33:24.647392 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="f3d61fdf-8cbc-400a-ab38-7ee67a131849" containerName="galera" probeResult="failure" output=< Feb 17 15:33:24 crc kubenswrapper[4806]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Feb 17 15:33:24 crc kubenswrapper[4806]: > Feb 17 15:33:25 crc kubenswrapper[4806]: I0217 15:33:25.627327 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-zfdgv"] Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.121672 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-zfdgv" event={"ID":"2d8cdc68-a20d-4a5c-8561-a6245c4277cc","Type":"ContainerStarted","Data":"3eb6e59e42aae0771c9fb08320d365359714ff7f682113024e3a2e91a63ca1ef"} Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.121733 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-zfdgv" event={"ID":"2d8cdc68-a20d-4a5c-8561-a6245c4277cc","Type":"ContainerStarted","Data":"c2ac815498ff99b65abce3fd327287ee732a7a22fe99f81a99d0fadcf219152c"} Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.139055 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/root-account-create-update-zfdgv" podStartSLOduration=3.139034304 podStartE2EDuration="3.139034304s" podCreationTimestamp="2026-02-17 15:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:33:26.13559021 +0000 UTC m=+767.666220641" watchObservedRunningTime="2026-02-17 15:33:26.139034304 +0000 UTC 
m=+767.669664715" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.599650 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk"] Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.601910 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.605886 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk"] Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.643084 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.745002 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76pzr\" (UniqueName: \"kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.745057 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.745086 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.846696 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76pzr\" (UniqueName: \"kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.846769 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.846812 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.847455 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: 
\"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.847455 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.866134 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76pzr\" (UniqueName: \"kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:26 crc kubenswrapper[4806]: I0217 15:33:26.967462 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:27 crc kubenswrapper[4806]: I0217 15:33:27.210395 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk"] Feb 17 15:33:28 crc kubenswrapper[4806]: I0217 15:33:28.155496 4806 generic.go:334] "Generic (PLEG): container finished" podID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerID="c2bc33f3752766e69f27325d508d366cc5737635657616d88e9cb6e2e0819b3b" exitCode=0 Feb 17 15:33:28 crc kubenswrapper[4806]: I0217 15:33:28.155561 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" event={"ID":"da3c84f4-8168-4262-8636-79b9c7bd7d4d","Type":"ContainerDied","Data":"c2bc33f3752766e69f27325d508d366cc5737635657616d88e9cb6e2e0819b3b"} Feb 17 15:33:28 crc kubenswrapper[4806]: I0217 15:33:28.155802 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" event={"ID":"da3c84f4-8168-4262-8636-79b9c7bd7d4d","Type":"ContainerStarted","Data":"4670857b5db9ed7009dd4798e026aa4509557253cd4b2aacbd39ebbe67e279c5"} Feb 17 15:33:28 crc kubenswrapper[4806]: I0217 15:33:28.158105 4806 generic.go:334] "Generic (PLEG): container finished" podID="2d8cdc68-a20d-4a5c-8561-a6245c4277cc" containerID="3eb6e59e42aae0771c9fb08320d365359714ff7f682113024e3a2e91a63ca1ef" exitCode=0 Feb 17 15:33:28 crc kubenswrapper[4806]: I0217 15:33:28.158156 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-zfdgv" event={"ID":"2d8cdc68-a20d-4a5c-8561-a6245c4277cc","Type":"ContainerDied","Data":"3eb6e59e42aae0771c9fb08320d365359714ff7f682113024e3a2e91a63ca1ef"} Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.573340 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.683130 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxcbb\" (UniqueName: \"kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb\") pod \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.683175 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts\") pod \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\" (UID: \"2d8cdc68-a20d-4a5c-8561-a6245c4277cc\") " Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.684157 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d8cdc68-a20d-4a5c-8561-a6245c4277cc" (UID: "2d8cdc68-a20d-4a5c-8561-a6245c4277cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.698551 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb" (OuterVolumeSpecName: "kube-api-access-xxcbb") pod "2d8cdc68-a20d-4a5c-8561-a6245c4277cc" (UID: "2d8cdc68-a20d-4a5c-8561-a6245c4277cc"). InnerVolumeSpecName "kube-api-access-xxcbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.784486 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:29 crc kubenswrapper[4806]: I0217 15:33:29.784521 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxcbb\" (UniqueName: \"kubernetes.io/projected/2d8cdc68-a20d-4a5c-8561-a6245c4277cc-kube-api-access-xxcbb\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:30 crc kubenswrapper[4806]: I0217 15:33:30.175046 4806 generic.go:334] "Generic (PLEG): container finished" podID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerID="01f3927b8a4a410697b16f2094899f49237e1cff017076ed78365858d5b63481" exitCode=0 Feb 17 15:33:30 crc kubenswrapper[4806]: I0217 15:33:30.175149 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" event={"ID":"da3c84f4-8168-4262-8636-79b9c7bd7d4d","Type":"ContainerDied","Data":"01f3927b8a4a410697b16f2094899f49237e1cff017076ed78365858d5b63481"} Feb 17 15:33:30 crc kubenswrapper[4806]: I0217 15:33:30.178300 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-zfdgv" event={"ID":"2d8cdc68-a20d-4a5c-8561-a6245c4277cc","Type":"ContainerDied","Data":"c2ac815498ff99b65abce3fd327287ee732a7a22fe99f81a99d0fadcf219152c"} Feb 17 15:33:30 crc kubenswrapper[4806]: I0217 15:33:30.178357 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ac815498ff99b65abce3fd327287ee732a7a22fe99f81a99d0fadcf219152c" Feb 17 15:33:30 crc kubenswrapper[4806]: I0217 15:33:30.178430 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-zfdgv" Feb 17 15:33:31 crc kubenswrapper[4806]: I0217 15:33:31.188279 4806 generic.go:334] "Generic (PLEG): container finished" podID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerID="dbaf6f417a717487f8eb44fd703bac1002d7b28b01656b2cefcb3d2e99ffee29" exitCode=0 Feb 17 15:33:31 crc kubenswrapper[4806]: I0217 15:33:31.188462 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" event={"ID":"da3c84f4-8168-4262-8636-79b9c7bd7d4d","Type":"ContainerDied","Data":"dbaf6f417a717487f8eb44fd703bac1002d7b28b01656b2cefcb3d2e99ffee29"} Feb 17 15:33:31 crc kubenswrapper[4806]: I0217 15:33:31.511375 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:33:31 crc kubenswrapper[4806]: I0217 15:33:31.606968 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-0" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.535452 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.621581 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle\") pod \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.621666 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util\") pod \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.621767 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76pzr\" (UniqueName: \"kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr\") pod \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\" (UID: \"da3c84f4-8168-4262-8636-79b9c7bd7d4d\") " Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.624341 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle" (OuterVolumeSpecName: "bundle") pod "da3c84f4-8168-4262-8636-79b9c7bd7d4d" (UID: "da3c84f4-8168-4262-8636-79b9c7bd7d4d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.628628 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr" (OuterVolumeSpecName: "kube-api-access-76pzr") pod "da3c84f4-8168-4262-8636-79b9c7bd7d4d" (UID: "da3c84f4-8168-4262-8636-79b9c7bd7d4d"). InnerVolumeSpecName "kube-api-access-76pzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.635465 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util" (OuterVolumeSpecName: "util") pod "da3c84f4-8168-4262-8636-79b9c7bd7d4d" (UID: "da3c84f4-8168-4262-8636-79b9c7bd7d4d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.723966 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76pzr\" (UniqueName: \"kubernetes.io/projected/da3c84f4-8168-4262-8636-79b9c7bd7d4d-kube-api-access-76pzr\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.724003 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:32 crc kubenswrapper[4806]: I0217 15:33:32.724011 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3c84f4-8168-4262-8636-79b9c7bd7d4d-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:33:33 crc kubenswrapper[4806]: I0217 15:33:33.206955 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" event={"ID":"da3c84f4-8168-4262-8636-79b9c7bd7d4d","Type":"ContainerDied","Data":"4670857b5db9ed7009dd4798e026aa4509557253cd4b2aacbd39ebbe67e279c5"} Feb 17 15:33:33 crc kubenswrapper[4806]: I0217 15:33:33.207011 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4670857b5db9ed7009dd4798e026aa4509557253cd4b2aacbd39ebbe67e279c5" Feb 17 15:33:33 crc kubenswrapper[4806]: I0217 15:33:33.207091 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk" Feb 17 15:33:34 crc kubenswrapper[4806]: I0217 15:33:34.784648 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:33:34 crc kubenswrapper[4806]: I0217 15:33:34.784963 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:33:35 crc kubenswrapper[4806]: I0217 15:33:35.106259 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:33:35 crc kubenswrapper[4806]: I0217 15:33:35.182991 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.870334 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp"] Feb 17 15:33:41 crc kubenswrapper[4806]: E0217 15:33:41.871362 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8cdc68-a20d-4a5c-8561-a6245c4277cc" containerName="mariadb-account-create-update" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871383 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8cdc68-a20d-4a5c-8561-a6245c4277cc" containerName="mariadb-account-create-update" Feb 17 15:33:41 crc kubenswrapper[4806]: E0217 15:33:41.871595 4806 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="extract" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871631 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="extract" Feb 17 15:33:41 crc kubenswrapper[4806]: E0217 15:33:41.871669 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="pull" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871677 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="pull" Feb 17 15:33:41 crc kubenswrapper[4806]: E0217 15:33:41.871693 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="util" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871698 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="util" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871932 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3c84f4-8168-4262-8636-79b9c7bd7d4d" containerName="extract" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.871953 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8cdc68-a20d-4a5c-8561-a6245c4277cc" containerName="mariadb-account-create-update" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.872413 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.875048 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-2mfpk" Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.883596 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp"] Feb 17 15:33:41 crc kubenswrapper[4806]: I0217 15:33:41.955812 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxmf\" (UniqueName: \"kubernetes.io/projected/e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f-kube-api-access-qlxmf\") pod \"rabbitmq-cluster-operator-779fc9694b-j9lkp\" (UID: \"e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" Feb 17 15:33:42 crc kubenswrapper[4806]: I0217 15:33:42.057072 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxmf\" (UniqueName: \"kubernetes.io/projected/e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f-kube-api-access-qlxmf\") pod \"rabbitmq-cluster-operator-779fc9694b-j9lkp\" (UID: \"e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" Feb 17 15:33:42 crc kubenswrapper[4806]: I0217 15:33:42.076730 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxmf\" (UniqueName: \"kubernetes.io/projected/e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f-kube-api-access-qlxmf\") pod \"rabbitmq-cluster-operator-779fc9694b-j9lkp\" (UID: \"e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" Feb 17 15:33:42 crc kubenswrapper[4806]: I0217 15:33:42.241764 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" Feb 17 15:33:42 crc kubenswrapper[4806]: I0217 15:33:42.512835 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp"] Feb 17 15:33:43 crc kubenswrapper[4806]: I0217 15:33:43.271807 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" event={"ID":"e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f","Type":"ContainerStarted","Data":"51e46fef317f449f43348845c6ad2a186d9cc209b776efaf135ac475fd207fd0"} Feb 17 15:33:46 crc kubenswrapper[4806]: I0217 15:33:46.293540 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" event={"ID":"e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f","Type":"ContainerStarted","Data":"a795d8cc2462b54df4701306658b3dca4458cbbb8531bcae74ea6abec245a2cd"} Feb 17 15:33:46 crc kubenswrapper[4806]: I0217 15:33:46.320705 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-j9lkp" podStartSLOduration=1.895077862 podStartE2EDuration="5.320685992s" podCreationTimestamp="2026-02-17 15:33:41 +0000 UTC" firstStartedPulling="2026-02-17 15:33:42.528564701 +0000 UTC m=+784.059195112" lastFinishedPulling="2026-02-17 15:33:45.954172831 +0000 UTC m=+787.484803242" observedRunningTime="2026-02-17 15:33:46.31936918 +0000 UTC m=+787.849999651" watchObservedRunningTime="2026-02-17 15:33:46.320685992 +0000 UTC m=+787.851316413" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.770194 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.771536 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.773671 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.773920 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.774067 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.774386 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-22qgk" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.775269 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.787956 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.826920 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.826994 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.827027 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.827084 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skmjk\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-kube-api-access-skmjk\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.827110 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.827160 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.827191 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 
crc kubenswrapper[4806]: I0217 15:33:52.827214 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.928022 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skmjk\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-kube-api-access-skmjk\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.928084 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.928126 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.928159 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.929333 
4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.929897 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.929962 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.929996 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.930324 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.929272 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.930666 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.932505 4806 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.932632 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7afc6e71e70d666ce26b1d2ca6702a479463fac926a1b2e445d6f9f4699bc839/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.935639 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.935734 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.936949 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:52 crc kubenswrapper[4806]: I0217 15:33:52.967835 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skmjk\" (UniqueName: \"kubernetes.io/projected/e00f765b-c8c1-44b2-ad4a-9c17876f7ab4-kube-api-access-skmjk\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:53 crc kubenswrapper[4806]: I0217 15:33:53.009186 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b99c4721-effa-4be8-a8fe-07b65978d92d\") pod \"rabbitmq-server-0\" (UID: \"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4\") " pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:53 crc kubenswrapper[4806]: I0217 15:33:53.093767 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:33:53 crc kubenswrapper[4806]: I0217 15:33:53.565857 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.360159 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.361034 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.363880 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-bhhsn" Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.368254 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4","Type":"ContainerStarted","Data":"5136d370b9f6c75eff9900e438ee1d0a314e25e2aa02a1d73abc9d7cd4a7fb78"} Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.373939 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.455446 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzgd\" (UniqueName: \"kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd\") pod \"keystone-operator-index-8gz5n\" (UID: \"5bfd6817-7026-43f4-81e9-e364980f3f78\") " pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.556432 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzgd\" (UniqueName: \"kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd\") pod \"keystone-operator-index-8gz5n\" (UID: \"5bfd6817-7026-43f4-81e9-e364980f3f78\") " pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.582005 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzgd\" (UniqueName: \"kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd\") pod \"keystone-operator-index-8gz5n\" (UID: \"5bfd6817-7026-43f4-81e9-e364980f3f78\") " 
pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:54 crc kubenswrapper[4806]: I0217 15:33:54.717343 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:55 crc kubenswrapper[4806]: I0217 15:33:55.133153 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:33:55 crc kubenswrapper[4806]: I0217 15:33:55.378352 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8gz5n" event={"ID":"5bfd6817-7026-43f4-81e9-e364980f3f78","Type":"ContainerStarted","Data":"ac50a2b0886135c23fb1808a662b91f10a0050c7510a37001a118f314f968cef"} Feb 17 15:33:57 crc kubenswrapper[4806]: I0217 15:33:57.392773 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8gz5n" event={"ID":"5bfd6817-7026-43f4-81e9-e364980f3f78","Type":"ContainerStarted","Data":"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f"} Feb 17 15:33:57 crc kubenswrapper[4806]: I0217 15:33:57.414219 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-8gz5n" podStartSLOduration=1.72340703 podStartE2EDuration="3.414194544s" podCreationTimestamp="2026-02-17 15:33:54 +0000 UTC" firstStartedPulling="2026-02-17 15:33:55.151931811 +0000 UTC m=+796.682562222" lastFinishedPulling="2026-02-17 15:33:56.842719325 +0000 UTC m=+798.373349736" observedRunningTime="2026-02-17 15:33:57.407837239 +0000 UTC m=+798.938467660" watchObservedRunningTime="2026-02-17 15:33:57.414194544 +0000 UTC m=+798.944824955" Feb 17 15:33:58 crc kubenswrapper[4806]: I0217 15:33:58.356527 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:33:58 crc kubenswrapper[4806]: I0217 15:33:58.959973 4806 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-index-qdvh5"] Feb 17 15:33:58 crc kubenswrapper[4806]: I0217 15:33:58.961087 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:33:58 crc kubenswrapper[4806]: I0217 15:33:58.976808 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-qdvh5"] Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.024067 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnwc\" (UniqueName: \"kubernetes.io/projected/0528fe97-ef52-441d-9e22-bf89676e6282-kube-api-access-zgnwc\") pod \"keystone-operator-index-qdvh5\" (UID: \"0528fe97-ef52-441d-9e22-bf89676e6282\") " pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.125174 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnwc\" (UniqueName: \"kubernetes.io/projected/0528fe97-ef52-441d-9e22-bf89676e6282-kube-api-access-zgnwc\") pod \"keystone-operator-index-qdvh5\" (UID: \"0528fe97-ef52-441d-9e22-bf89676e6282\") " pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.155612 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnwc\" (UniqueName: \"kubernetes.io/projected/0528fe97-ef52-441d-9e22-bf89676e6282-kube-api-access-zgnwc\") pod \"keystone-operator-index-qdvh5\" (UID: \"0528fe97-ef52-441d-9e22-bf89676e6282\") " pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.282351 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.407213 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-8gz5n" podUID="5bfd6817-7026-43f4-81e9-e364980f3f78" containerName="registry-server" containerID="cri-o://fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f" gracePeriod=2 Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.720492 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-qdvh5"] Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.803337 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.840123 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtzgd\" (UniqueName: \"kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd\") pod \"5bfd6817-7026-43f4-81e9-e364980f3f78\" (UID: \"5bfd6817-7026-43f4-81e9-e364980f3f78\") " Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.848210 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd" (OuterVolumeSpecName: "kube-api-access-rtzgd") pod "5bfd6817-7026-43f4-81e9-e364980f3f78" (UID: "5bfd6817-7026-43f4-81e9-e364980f3f78"). InnerVolumeSpecName "kube-api-access-rtzgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:33:59 crc kubenswrapper[4806]: I0217 15:33:59.941774 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtzgd\" (UniqueName: \"kubernetes.io/projected/5bfd6817-7026-43f4-81e9-e364980f3f78-kube-api-access-rtzgd\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.418326 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-qdvh5" event={"ID":"0528fe97-ef52-441d-9e22-bf89676e6282","Type":"ContainerStarted","Data":"eabec80865af86eeb1d59c7e043db7b965a365c9a5602dff3b926e78f404d531"} Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.418392 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-qdvh5" event={"ID":"0528fe97-ef52-441d-9e22-bf89676e6282","Type":"ContainerStarted","Data":"e73dca8b76b647cdb0c812e9bc322db5e51035f75acc47283eec1a0e714f0744"} Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.426181 4806 generic.go:334] "Generic (PLEG): container finished" podID="5bfd6817-7026-43f4-81e9-e364980f3f78" containerID="fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f" exitCode=0 Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.426239 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-8gz5n" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.426243 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8gz5n" event={"ID":"5bfd6817-7026-43f4-81e9-e364980f3f78","Type":"ContainerDied","Data":"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f"} Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.426358 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-8gz5n" event={"ID":"5bfd6817-7026-43f4-81e9-e364980f3f78","Type":"ContainerDied","Data":"ac50a2b0886135c23fb1808a662b91f10a0050c7510a37001a118f314f968cef"} Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.426595 4806 scope.go:117] "RemoveContainer" containerID="fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.449229 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-qdvh5" podStartSLOduration=2.048047092 podStartE2EDuration="2.449157416s" podCreationTimestamp="2026-02-17 15:33:58 +0000 UTC" firstStartedPulling="2026-02-17 15:33:59.73310434 +0000 UTC m=+801.263734751" lastFinishedPulling="2026-02-17 15:34:00.134214634 +0000 UTC m=+801.664845075" observedRunningTime="2026-02-17 15:34:00.438079646 +0000 UTC m=+801.968710057" watchObservedRunningTime="2026-02-17 15:34:00.449157416 +0000 UTC m=+801.979787847" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.462176 4806 scope.go:117] "RemoveContainer" containerID="fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.465131 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:34:00 crc kubenswrapper[4806]: E0217 15:34:00.465682 4806 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f\": container with ID starting with fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f not found: ID does not exist" containerID="fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.465750 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f"} err="failed to get container status \"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f\": rpc error: code = NotFound desc = could not find container \"fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f\": container with ID starting with fb4ad783d94be9ef76b5b5d25290320f121ee0c8bdd699edc86a58921c57cb8f not found: ID does not exist" Feb 17 15:34:00 crc kubenswrapper[4806]: I0217 15:34:00.475220 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-8gz5n"] Feb 17 15:34:01 crc kubenswrapper[4806]: I0217 15:34:01.177938 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfd6817-7026-43f4-81e9-e364980f3f78" path="/var/lib/kubelet/pods/5bfd6817-7026-43f4-81e9-e364980f3f78/volumes" Feb 17 15:34:04 crc kubenswrapper[4806]: I0217 15:34:04.788634 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:34:04 crc kubenswrapper[4806]: I0217 15:34:04.789158 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:34:07 crc kubenswrapper[4806]: I0217 15:34:07.491532 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4","Type":"ContainerStarted","Data":"b21921c88bb65835183aa0713d031c638d2d233f92e66c2461b812a4b72eff8f"} Feb 17 15:34:09 crc kubenswrapper[4806]: I0217 15:34:09.283168 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:34:09 crc kubenswrapper[4806]: I0217 15:34:09.283836 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:34:09 crc kubenswrapper[4806]: I0217 15:34:09.324141 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:34:09 crc kubenswrapper[4806]: I0217 15:34:09.552154 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-qdvh5" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.435577 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k"] Feb 17 15:34:13 crc kubenswrapper[4806]: E0217 15:34:13.436763 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfd6817-7026-43f4-81e9-e364980f3f78" containerName="registry-server" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.436786 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfd6817-7026-43f4-81e9-e364980f3f78" containerName="registry-server" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.437030 4806 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5bfd6817-7026-43f4-81e9-e364980f3f78" containerName="registry-server" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.438608 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.441376 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.447881 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k"] Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.553922 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.553996 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmjh\" (UniqueName: \"kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.554153 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" 
(UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.655363 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.655527 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.655624 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmjh\" (UniqueName: \"kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.656060 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc 
kubenswrapper[4806]: I0217 15:34:13.656284 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.690648 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmjh\" (UniqueName: \"kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh\") pod \"414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:13 crc kubenswrapper[4806]: I0217 15:34:13.762132 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:14 crc kubenswrapper[4806]: I0217 15:34:14.236393 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k"] Feb 17 15:34:14 crc kubenswrapper[4806]: I0217 15:34:14.559040 4806 generic.go:334] "Generic (PLEG): container finished" podID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerID="0f1b4ce3ebf5a6f854a3f6218aad4add0f1c5a5f8c7be95ee9c45b6c5a29ac2c" exitCode=0 Feb 17 15:34:14 crc kubenswrapper[4806]: I0217 15:34:14.559526 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" event={"ID":"eebc9abb-adc9-47ae-a370-fccd9e91a4da","Type":"ContainerDied","Data":"0f1b4ce3ebf5a6f854a3f6218aad4add0f1c5a5f8c7be95ee9c45b6c5a29ac2c"} Feb 17 15:34:14 crc kubenswrapper[4806]: I0217 15:34:14.559584 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" event={"ID":"eebc9abb-adc9-47ae-a370-fccd9e91a4da","Type":"ContainerStarted","Data":"38695b228a80bd2754f06b1a35365ae46a1e1dd2a6428931b0b9603452e3e52c"} Feb 17 15:34:15 crc kubenswrapper[4806]: I0217 15:34:15.572841 4806 generic.go:334] "Generic (PLEG): container finished" podID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerID="69abe659e29509e066e80f0db83aa87ff73748a6ea17af935313eb5cce768eca" exitCode=0 Feb 17 15:34:15 crc kubenswrapper[4806]: I0217 15:34:15.573271 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" event={"ID":"eebc9abb-adc9-47ae-a370-fccd9e91a4da","Type":"ContainerDied","Data":"69abe659e29509e066e80f0db83aa87ff73748a6ea17af935313eb5cce768eca"} Feb 17 15:34:16 crc kubenswrapper[4806]: I0217 15:34:16.587602 4806 generic.go:334] 
"Generic (PLEG): container finished" podID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerID="4e24bb36d037048d1d0b3a65b89943f3ee0356a59e980442a0cca6a83cf179d2" exitCode=0 Feb 17 15:34:16 crc kubenswrapper[4806]: I0217 15:34:16.587662 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" event={"ID":"eebc9abb-adc9-47ae-a370-fccd9e91a4da","Type":"ContainerDied","Data":"4e24bb36d037048d1d0b3a65b89943f3ee0356a59e980442a0cca6a83cf179d2"} Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.874397 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.956562 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util\") pod \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.956619 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmjh\" (UniqueName: \"kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh\") pod \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.956652 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle\") pod \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\" (UID: \"eebc9abb-adc9-47ae-a370-fccd9e91a4da\") " Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.958677 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle" (OuterVolumeSpecName: "bundle") pod "eebc9abb-adc9-47ae-a370-fccd9e91a4da" (UID: "eebc9abb-adc9-47ae-a370-fccd9e91a4da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.973651 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh" (OuterVolumeSpecName: "kube-api-access-mpmjh") pod "eebc9abb-adc9-47ae-a370-fccd9e91a4da" (UID: "eebc9abb-adc9-47ae-a370-fccd9e91a4da"). InnerVolumeSpecName "kube-api-access-mpmjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:34:17 crc kubenswrapper[4806]: I0217 15:34:17.985480 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util" (OuterVolumeSpecName: "util") pod "eebc9abb-adc9-47ae-a370-fccd9e91a4da" (UID: "eebc9abb-adc9-47ae-a370-fccd9e91a4da"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.058169 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.058201 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmjh\" (UniqueName: \"kubernetes.io/projected/eebc9abb-adc9-47ae-a370-fccd9e91a4da-kube-api-access-mpmjh\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.058212 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eebc9abb-adc9-47ae-a370-fccd9e91a4da-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.606224 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" event={"ID":"eebc9abb-adc9-47ae-a370-fccd9e91a4da","Type":"ContainerDied","Data":"38695b228a80bd2754f06b1a35365ae46a1e1dd2a6428931b0b9603452e3e52c"} Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.606288 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38695b228a80bd2754f06b1a35365ae46a1e1dd2a6428931b0b9603452e3e52c" Feb 17 15:34:18 crc kubenswrapper[4806]: I0217 15:34:18.606393 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.666067 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7"] Feb 17 15:34:30 crc kubenswrapper[4806]: E0217 15:34:30.667108 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="util" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.667128 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="util" Feb 17 15:34:30 crc kubenswrapper[4806]: E0217 15:34:30.667139 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="pull" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.667146 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="pull" Feb 17 15:34:30 crc kubenswrapper[4806]: E0217 15:34:30.667161 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="extract" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.667167 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="extract" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.667277 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="eebc9abb-adc9-47ae-a370-fccd9e91a4da" containerName="extract" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.667854 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.673668 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kdmwv" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.673687 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.681893 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7"] Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.778932 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zmmn\" (UniqueName: \"kubernetes.io/projected/ac32dab0-e793-4cbc-b363-a98a142aec89-kube-api-access-8zmmn\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.779036 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-webhook-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.779063 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-apiservice-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" 
(UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.880561 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-webhook-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.880634 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-apiservice-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.880769 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zmmn\" (UniqueName: \"kubernetes.io/projected/ac32dab0-e793-4cbc-b363-a98a142aec89-kube-api-access-8zmmn\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.889007 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-apiservice-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.889091 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ac32dab0-e793-4cbc-b363-a98a142aec89-webhook-cert\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.900678 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zmmn\" (UniqueName: \"kubernetes.io/projected/ac32dab0-e793-4cbc-b363-a98a142aec89-kube-api-access-8zmmn\") pod \"keystone-operator-controller-manager-568c5665fb-9wsl7\" (UID: \"ac32dab0-e793-4cbc-b363-a98a142aec89\") " pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:30 crc kubenswrapper[4806]: I0217 15:34:30.990732 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:31 crc kubenswrapper[4806]: I0217 15:34:31.547724 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7"] Feb 17 15:34:31 crc kubenswrapper[4806]: W0217 15:34:31.554159 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac32dab0_e793_4cbc_b363_a98a142aec89.slice/crio-af5b18472d82645dd1d0894af636a7f719daeb9f51ee7b83a6060c7bf01d2951 WatchSource:0}: Error finding container af5b18472d82645dd1d0894af636a7f719daeb9f51ee7b83a6060c7bf01d2951: Status 404 returned error can't find the container with id af5b18472d82645dd1d0894af636a7f719daeb9f51ee7b83a6060c7bf01d2951 Feb 17 15:34:31 crc kubenswrapper[4806]: I0217 15:34:31.703349 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" 
event={"ID":"ac32dab0-e793-4cbc-b363-a98a142aec89","Type":"ContainerStarted","Data":"af5b18472d82645dd1d0894af636a7f719daeb9f51ee7b83a6060c7bf01d2951"} Feb 17 15:34:34 crc kubenswrapper[4806]: I0217 15:34:34.784797 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:34:34 crc kubenswrapper[4806]: I0217 15:34:34.785358 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:34:34 crc kubenswrapper[4806]: I0217 15:34:34.785432 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:34:34 crc kubenswrapper[4806]: I0217 15:34:34.785960 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:34:34 crc kubenswrapper[4806]: I0217 15:34:34.786027 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395" gracePeriod=600 Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.733136 
4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395" exitCode=0 Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.733332 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395"} Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.734125 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a"} Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.734154 4806 scope.go:117] "RemoveContainer" containerID="3c9f6cff3a70a1759104fc9fce3a1e5e0b42b5eabe6508edc074aa83565b2162" Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.736512 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" event={"ID":"ac32dab0-e793-4cbc-b363-a98a142aec89","Type":"ContainerStarted","Data":"3ed799fcb431e5c363c90bb04d2dc850a2a8c0e874c8c141d7e43b597b767ab8"} Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.736757 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:35 crc kubenswrapper[4806]: I0217 15:34:35.775545 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" podStartSLOduration=2.334320097 podStartE2EDuration="5.775516698s" podCreationTimestamp="2026-02-17 15:34:30 +0000 UTC" firstStartedPulling="2026-02-17 15:34:31.557389459 +0000 
UTC m=+833.088019880" lastFinishedPulling="2026-02-17 15:34:34.99858606 +0000 UTC m=+836.529216481" observedRunningTime="2026-02-17 15:34:35.771216842 +0000 UTC m=+837.301847273" watchObservedRunningTime="2026-02-17 15:34:35.775516698 +0000 UTC m=+837.306147119" Feb 17 15:34:39 crc kubenswrapper[4806]: I0217 15:34:39.768230 4806 generic.go:334] "Generic (PLEG): container finished" podID="e00f765b-c8c1-44b2-ad4a-9c17876f7ab4" containerID="b21921c88bb65835183aa0713d031c638d2d233f92e66c2461b812a4b72eff8f" exitCode=0 Feb 17 15:34:39 crc kubenswrapper[4806]: I0217 15:34:39.768316 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4","Type":"ContainerDied","Data":"b21921c88bb65835183aa0713d031c638d2d233f92e66c2461b812a4b72eff8f"} Feb 17 15:34:40 crc kubenswrapper[4806]: I0217 15:34:40.782036 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"e00f765b-c8c1-44b2-ad4a-9c17876f7ab4","Type":"ContainerStarted","Data":"cb8a433c3cc71300e34b28fe39882af4d722751dee3973c3eac64f1397d6e695"} Feb 17 15:34:40 crc kubenswrapper[4806]: I0217 15:34:40.783191 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:34:40 crc kubenswrapper[4806]: I0217 15:34:40.819005 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.176132712 podStartE2EDuration="49.818772542s" podCreationTimestamp="2026-02-17 15:33:51 +0000 UTC" firstStartedPulling="2026-02-17 15:33:53.576980004 +0000 UTC m=+795.107610415" lastFinishedPulling="2026-02-17 15:34:06.219619834 +0000 UTC m=+807.750250245" observedRunningTime="2026-02-17 15:34:40.808601621 +0000 UTC m=+842.339232082" watchObservedRunningTime="2026-02-17 15:34:40.818772542 +0000 UTC m=+842.349402973" Feb 17 15:34:40 crc kubenswrapper[4806]: I0217 
15:34:40.995726 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-568c5665fb-9wsl7" Feb 17 15:34:41 crc kubenswrapper[4806]: I0217 15:34:41.992868 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-5pkxg"] Feb 17 15:34:41 crc kubenswrapper[4806]: I0217 15:34:41.993948 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:41.999970 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-11a8-account-create-update-7chcm"] Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.000887 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.002529 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.005908 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-5pkxg"] Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.016152 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-11a8-account-create-update-7chcm"] Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.062571 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcsx\" (UniqueName: \"kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.062795 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.164479 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnnp\" (UniqueName: \"kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp\") pod \"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.164594 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.164639 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts\") pod \"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.164664 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcsx\" (UniqueName: \"kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " 
pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.166472 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.186164 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcsx\" (UniqueName: \"kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx\") pod \"keystone-db-create-5pkxg\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.266025 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts\") pod \"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.267268 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts\") pod \"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.267509 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnnp\" (UniqueName: \"kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp\") pod 
\"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.301726 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnnp\" (UniqueName: \"kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp\") pod \"keystone-11a8-account-create-update-7chcm\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.324800 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.350089 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.846881 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-11a8-account-create-update-7chcm"] Feb 17 15:34:42 crc kubenswrapper[4806]: W0217 15:34:42.859597 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa53f26_bda2_425f_a3be_e25f841cd4ed.slice/crio-d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891 WatchSource:0}: Error finding container d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891: Status 404 returned error can't find the container with id d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891 Feb 17 15:34:42 crc kubenswrapper[4806]: I0217 15:34:42.940873 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-5pkxg"] Feb 17 15:34:42 crc kubenswrapper[4806]: W0217 15:34:42.984766 4806 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03ca136_92c9_4afb_8044_3ddd06b0fd24.slice/crio-d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1 WatchSource:0}: Error finding container d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1: Status 404 returned error can't find the container with id d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1 Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.805775 4806 generic.go:334] "Generic (PLEG): container finished" podID="baa53f26-bda2-425f-a3be-e25f841cd4ed" containerID="5d38decaa86b5cf9fbc7970243673df167ac3a039478b355704c6221445c084f" exitCode=0 Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.805861 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" event={"ID":"baa53f26-bda2-425f-a3be-e25f841cd4ed","Type":"ContainerDied","Data":"5d38decaa86b5cf9fbc7970243673df167ac3a039478b355704c6221445c084f"} Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.805899 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" event={"ID":"baa53f26-bda2-425f-a3be-e25f841cd4ed","Type":"ContainerStarted","Data":"d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891"} Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.807995 4806 generic.go:334] "Generic (PLEG): container finished" podID="e03ca136-92c9-4afb-8044-3ddd06b0fd24" containerID="0c62b706621968e83c330da71613e36b3d487bf89e9cc2f0d544e94e1e77839e" exitCode=0 Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.808044 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-5pkxg" event={"ID":"e03ca136-92c9-4afb-8044-3ddd06b0fd24","Type":"ContainerDied","Data":"0c62b706621968e83c330da71613e36b3d487bf89e9cc2f0d544e94e1e77839e"} Feb 17 15:34:43 crc kubenswrapper[4806]: I0217 15:34:43.808077 4806 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-5pkxg" event={"ID":"e03ca136-92c9-4afb-8044-3ddd06b0fd24","Type":"ContainerStarted","Data":"d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1"} Feb 17 15:34:44 crc kubenswrapper[4806]: I0217 15:34:44.768038 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:44 crc kubenswrapper[4806]: I0217 15:34:44.769470 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:44 crc kubenswrapper[4806]: I0217 15:34:44.771544 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-7p4hp" Feb 17 15:34:44 crc kubenswrapper[4806]: I0217 15:34:44.788911 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:44 crc kubenswrapper[4806]: I0217 15:34:44.908101 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9r6w\" (UniqueName: \"kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w\") pod \"horizon-operator-index-ntnhc\" (UID: \"957a81d8-d3e5-4362-9b1e-eaf4617f6d90\") " pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.008996 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9r6w\" (UniqueName: \"kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w\") pod \"horizon-operator-index-ntnhc\" (UID: \"957a81d8-d3e5-4362-9b1e-eaf4617f6d90\") " pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.031417 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9r6w\" 
(UniqueName: \"kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w\") pod \"horizon-operator-index-ntnhc\" (UID: \"957a81d8-d3e5-4362-9b1e-eaf4617f6d90\") " pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.086906 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.313575 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.330108 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.450048 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsnnp\" (UniqueName: \"kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp\") pod \"baa53f26-bda2-425f-a3be-e25f841cd4ed\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.450495 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts\") pod \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.450541 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fcsx\" (UniqueName: \"kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx\") pod \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\" (UID: \"e03ca136-92c9-4afb-8044-3ddd06b0fd24\") " Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 
15:34:45.450607 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts\") pod \"baa53f26-bda2-425f-a3be-e25f841cd4ed\" (UID: \"baa53f26-bda2-425f-a3be-e25f841cd4ed\") " Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.451146 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e03ca136-92c9-4afb-8044-3ddd06b0fd24" (UID: "e03ca136-92c9-4afb-8044-3ddd06b0fd24"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.451241 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "baa53f26-bda2-425f-a3be-e25f841cd4ed" (UID: "baa53f26-bda2-425f-a3be-e25f841cd4ed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.451453 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e03ca136-92c9-4afb-8044-3ddd06b0fd24-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.451492 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa53f26-bda2-425f-a3be-e25f841cd4ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.456254 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx" (OuterVolumeSpecName: "kube-api-access-6fcsx") pod "e03ca136-92c9-4afb-8044-3ddd06b0fd24" (UID: "e03ca136-92c9-4afb-8044-3ddd06b0fd24"). InnerVolumeSpecName "kube-api-access-6fcsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.456460 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp" (OuterVolumeSpecName: "kube-api-access-bsnnp") pod "baa53f26-bda2-425f-a3be-e25f841cd4ed" (UID: "baa53f26-bda2-425f-a3be-e25f841cd4ed"). InnerVolumeSpecName "kube-api-access-bsnnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.553244 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsnnp\" (UniqueName: \"kubernetes.io/projected/baa53f26-bda2-425f-a3be-e25f841cd4ed-kube-api-access-bsnnp\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.553285 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fcsx\" (UniqueName: \"kubernetes.io/projected/e03ca136-92c9-4afb-8044-3ddd06b0fd24-kube-api-access-6fcsx\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.617535 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:45 crc kubenswrapper[4806]: W0217 15:34:45.623543 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957a81d8_d3e5_4362_9b1e_eaf4617f6d90.slice/crio-b04969b53023e64ed8858ca281ec033606a1cda1792084ff04984a4dde53674a WatchSource:0}: Error finding container b04969b53023e64ed8858ca281ec033606a1cda1792084ff04984a4dde53674a: Status 404 returned error can't find the container with id b04969b53023e64ed8858ca281ec033606a1cda1792084ff04984a4dde53674a Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.820999 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.820968 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-11a8-account-create-update-7chcm" event={"ID":"baa53f26-bda2-425f-a3be-e25f841cd4ed","Type":"ContainerDied","Data":"d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891"} Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.821204 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0af4654ba9af9a60f1b5944d6dd177a1ae61486333ee337663a03c36112a891" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.822946 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-5pkxg" event={"ID":"e03ca136-92c9-4afb-8044-3ddd06b0fd24","Type":"ContainerDied","Data":"d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1"} Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.822980 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a17ed45cb9534065a47ce339752c0147e26bcbbd634c0c2b59e4459945a9b1" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.823011 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-5pkxg" Feb 17 15:34:45 crc kubenswrapper[4806]: I0217 15:34:45.824271 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-ntnhc" event={"ID":"957a81d8-d3e5-4362-9b1e-eaf4617f6d90","Type":"ContainerStarted","Data":"b04969b53023e64ed8858ca281ec033606a1cda1792084ff04984a4dde53674a"} Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.561916 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-kdftz"] Feb 17 15:34:48 crc kubenswrapper[4806]: E0217 15:34:48.563068 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa53f26-bda2-425f-a3be-e25f841cd4ed" containerName="mariadb-account-create-update" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.563091 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa53f26-bda2-425f-a3be-e25f841cd4ed" containerName="mariadb-account-create-update" Feb 17 15:34:48 crc kubenswrapper[4806]: E0217 15:34:48.563138 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03ca136-92c9-4afb-8044-3ddd06b0fd24" containerName="mariadb-database-create" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.563151 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03ca136-92c9-4afb-8044-3ddd06b0fd24" containerName="mariadb-database-create" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.563341 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03ca136-92c9-4afb-8044-3ddd06b0fd24" containerName="mariadb-database-create" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.563360 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa53f26-bda2-425f-a3be-e25f841cd4ed" containerName="mariadb-account-create-update" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.563990 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.566147 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-g54sw" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.571352 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-kdftz"] Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.711235 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvlt\" (UniqueName: \"kubernetes.io/projected/4eb347b9-421b-4c66-97d5-1649602d2dd6-kube-api-access-jbvlt\") pod \"swift-operator-index-kdftz\" (UID: \"4eb347b9-421b-4c66-97d5-1649602d2dd6\") " pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.813286 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvlt\" (UniqueName: \"kubernetes.io/projected/4eb347b9-421b-4c66-97d5-1649602d2dd6-kube-api-access-jbvlt\") pod \"swift-operator-index-kdftz\" (UID: \"4eb347b9-421b-4c66-97d5-1649602d2dd6\") " pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.840214 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvlt\" (UniqueName: \"kubernetes.io/projected/4eb347b9-421b-4c66-97d5-1649602d2dd6-kube-api-access-jbvlt\") pod \"swift-operator-index-kdftz\" (UID: \"4eb347b9-421b-4c66-97d5-1649602d2dd6\") " pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.848564 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-ntnhc" 
event={"ID":"957a81d8-d3e5-4362-9b1e-eaf4617f6d90","Type":"ContainerStarted","Data":"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6"} Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.869593 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-ntnhc" podStartSLOduration=2.683222035 podStartE2EDuration="4.869565943s" podCreationTimestamp="2026-02-17 15:34:44 +0000 UTC" firstStartedPulling="2026-02-17 15:34:45.625752695 +0000 UTC m=+847.156383106" lastFinishedPulling="2026-02-17 15:34:47.812096593 +0000 UTC m=+849.342727014" observedRunningTime="2026-02-17 15:34:48.861811451 +0000 UTC m=+850.392441892" watchObservedRunningTime="2026-02-17 15:34:48.869565943 +0000 UTC m=+850.400196384" Feb 17 15:34:48 crc kubenswrapper[4806]: I0217 15:34:48.930135 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:49 crc kubenswrapper[4806]: I0217 15:34:49.379945 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-kdftz"] Feb 17 15:34:49 crc kubenswrapper[4806]: I0217 15:34:49.858373 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-kdftz" event={"ID":"4eb347b9-421b-4c66-97d5-1649602d2dd6","Type":"ContainerStarted","Data":"00e601fcd64bb4638c9fb8dfddd8b092e01abe85c7198bb85b8487e69f3e7373"} Feb 17 15:34:50 crc kubenswrapper[4806]: I0217 15:34:50.556134 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:50 crc kubenswrapper[4806]: I0217 15:34:50.866371 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/horizon-operator-index-ntnhc" podUID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" containerName="registry-server" 
containerID="cri-o://35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6" gracePeriod=2 Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.363211 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-9b5kl"] Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.364527 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.373671 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-9b5kl"] Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.375799 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.556176 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9r6w\" (UniqueName: \"kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w\") pod \"957a81d8-d3e5-4362-9b1e-eaf4617f6d90\" (UID: \"957a81d8-d3e5-4362-9b1e-eaf4617f6d90\") " Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.556489 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9gdm\" (UniqueName: \"kubernetes.io/projected/4da12153-8e2b-42f8-b498-15f3035a3769-kube-api-access-d9gdm\") pod \"horizon-operator-index-9b5kl\" (UID: \"4da12153-8e2b-42f8-b498-15f3035a3769\") " pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.569877 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w" (OuterVolumeSpecName: "kube-api-access-m9r6w") pod "957a81d8-d3e5-4362-9b1e-eaf4617f6d90" (UID: 
"957a81d8-d3e5-4362-9b1e-eaf4617f6d90"). InnerVolumeSpecName "kube-api-access-m9r6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.657991 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9gdm\" (UniqueName: \"kubernetes.io/projected/4da12153-8e2b-42f8-b498-15f3035a3769-kube-api-access-d9gdm\") pod \"horizon-operator-index-9b5kl\" (UID: \"4da12153-8e2b-42f8-b498-15f3035a3769\") " pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.658505 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9r6w\" (UniqueName: \"kubernetes.io/projected/957a81d8-d3e5-4362-9b1e-eaf4617f6d90-kube-api-access-m9r6w\") on node \"crc\" DevicePath \"\"" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.676606 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9gdm\" (UniqueName: \"kubernetes.io/projected/4da12153-8e2b-42f8-b498-15f3035a3769-kube-api-access-d9gdm\") pod \"horizon-operator-index-9b5kl\" (UID: \"4da12153-8e2b-42f8-b498-15f3035a3769\") " pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.692299 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.877623 4806 generic.go:334] "Generic (PLEG): container finished" podID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" containerID="35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6" exitCode=0 Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.877678 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-ntnhc" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.877721 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-ntnhc" event={"ID":"957a81d8-d3e5-4362-9b1e-eaf4617f6d90","Type":"ContainerDied","Data":"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6"} Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.877766 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-ntnhc" event={"ID":"957a81d8-d3e5-4362-9b1e-eaf4617f6d90","Type":"ContainerDied","Data":"b04969b53023e64ed8858ca281ec033606a1cda1792084ff04984a4dde53674a"} Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.877794 4806 scope.go:117] "RemoveContainer" containerID="35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.887450 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-kdftz" event={"ID":"4eb347b9-421b-4c66-97d5-1649602d2dd6","Type":"ContainerStarted","Data":"f02a67bad98b9222371d419a781eb7e434f824a6440dac1a030cde27afd0e573"} Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.922257 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-kdftz" podStartSLOduration=2.2556066550000002 podStartE2EDuration="3.922230325s" podCreationTimestamp="2026-02-17 15:34:48 +0000 UTC" firstStartedPulling="2026-02-17 15:34:49.394832407 +0000 UTC m=+850.925462818" lastFinishedPulling="2026-02-17 15:34:51.061456077 +0000 UTC m=+852.592086488" observedRunningTime="2026-02-17 15:34:51.915792006 +0000 UTC m=+853.446422467" watchObservedRunningTime="2026-02-17 15:34:51.922230325 +0000 UTC m=+853.452860766" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.927276 4806 scope.go:117] "RemoveContainer" 
containerID="35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6" Feb 17 15:34:51 crc kubenswrapper[4806]: E0217 15:34:51.929111 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6\": container with ID starting with 35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6 not found: ID does not exist" containerID="35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.929167 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6"} err="failed to get container status \"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6\": rpc error: code = NotFound desc = could not find container \"35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6\": container with ID starting with 35d67cb0471b3af30d031025f74b93738895809203ffb27acb2d4307a52db1a6 not found: ID does not exist" Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.938047 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:51 crc kubenswrapper[4806]: I0217 15:34:51.949863 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/horizon-operator-index-ntnhc"] Feb 17 15:34:52 crc kubenswrapper[4806]: I0217 15:34:52.213184 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-9b5kl"] Feb 17 15:34:52 crc kubenswrapper[4806]: I0217 15:34:52.895736 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-9b5kl" event={"ID":"4da12153-8e2b-42f8-b498-15f3035a3769","Type":"ContainerStarted","Data":"8836232fcb38ebcbb6f5e668ddb8100685ee3ba7420f9b851995b4184dbdd5a7"} Feb 17 15:34:53 
crc kubenswrapper[4806]: I0217 15:34:53.096704 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.173394 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" path="/var/lib/kubelet/pods/957a81d8-d3e5-4362-9b1e-eaf4617f6d90/volumes" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.643920 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-nzrjr"] Feb 17 15:34:53 crc kubenswrapper[4806]: E0217 15:34:53.644262 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" containerName="registry-server" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.644293 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" containerName="registry-server" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.644465 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="957a81d8-d3e5-4362-9b1e-eaf4617f6d90" containerName="registry-server" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.645001 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.648196 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.648197 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.649253 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.650070 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-zwwkb" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.661835 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-nzrjr"] Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.797026 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data\") pod \"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.797102 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2pnb\" (UniqueName: \"kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb\") pod \"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.898063 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data\") pod 
\"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.899517 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2pnb\" (UniqueName: \"kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb\") pod \"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.906792 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data\") pod \"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.909109 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-9b5kl" event={"ID":"4da12153-8e2b-42f8-b498-15f3035a3769","Type":"ContainerStarted","Data":"1169b21f41882d0f8ec20988cead12c8af5c869ea5f04d92d20491888f85a59d"} Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.923813 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2pnb\" (UniqueName: \"kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb\") pod \"keystone-db-sync-nzrjr\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.932033 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-9b5kl" podStartSLOduration=2.4462472699999998 podStartE2EDuration="2.93200842s" podCreationTimestamp="2026-02-17 15:34:51 +0000 UTC" firstStartedPulling="2026-02-17 15:34:52.233256485 +0000 UTC 
m=+853.763886936" lastFinishedPulling="2026-02-17 15:34:52.719017625 +0000 UTC m=+854.249648086" observedRunningTime="2026-02-17 15:34:53.930771429 +0000 UTC m=+855.461401860" watchObservedRunningTime="2026-02-17 15:34:53.93200842 +0000 UTC m=+855.462638861" Feb 17 15:34:53 crc kubenswrapper[4806]: I0217 15:34:53.987172 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:34:54 crc kubenswrapper[4806]: I0217 15:34:54.417853 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-nzrjr"] Feb 17 15:34:54 crc kubenswrapper[4806]: W0217 15:34:54.423875 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abc5787_64ef_4761_ab4e_38aec08f2c1b.slice/crio-4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb WatchSource:0}: Error finding container 4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb: Status 404 returned error can't find the container with id 4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb Feb 17 15:34:54 crc kubenswrapper[4806]: I0217 15:34:54.916628 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" event={"ID":"2abc5787-64ef-4761-ab4e-38aec08f2c1b","Type":"ContainerStarted","Data":"4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb"} Feb 17 15:34:58 crc kubenswrapper[4806]: I0217 15:34:58.930665 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:58 crc kubenswrapper[4806]: I0217 15:34:58.931250 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:58 crc kubenswrapper[4806]: I0217 15:34:58.957977 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:34:58 crc kubenswrapper[4806]: I0217 15:34:58.995791 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-kdftz" Feb 17 15:35:01 crc kubenswrapper[4806]: I0217 15:35:01.692960 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:35:01 crc kubenswrapper[4806]: I0217 15:35:01.693618 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:35:01 crc kubenswrapper[4806]: I0217 15:35:01.719499 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:35:01 crc kubenswrapper[4806]: I0217 15:35:01.967659 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" event={"ID":"2abc5787-64ef-4761-ab4e-38aec08f2c1b","Type":"ContainerStarted","Data":"7403457696ed16220199b64b0bbf9863f3e1635cb48f2f35e467b3ea6d347b4b"} Feb 17 15:35:01 crc kubenswrapper[4806]: I0217 15:35:01.992682 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" podStartSLOduration=2.484711912 podStartE2EDuration="8.992661708s" podCreationTimestamp="2026-02-17 15:34:53 +0000 UTC" firstStartedPulling="2026-02-17 15:34:54.426467874 +0000 UTC m=+855.957098285" lastFinishedPulling="2026-02-17 15:35:00.93441765 +0000 UTC m=+862.465048081" observedRunningTime="2026-02-17 15:35:01.992197616 +0000 UTC m=+863.522828047" watchObservedRunningTime="2026-02-17 15:35:01.992661708 +0000 UTC m=+863.523292119" Feb 17 15:35:02 crc kubenswrapper[4806]: I0217 15:35:02.013136 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-9b5kl" Feb 17 15:35:04 crc kubenswrapper[4806]: I0217 
15:35:04.994337 4806 generic.go:334] "Generic (PLEG): container finished" podID="2abc5787-64ef-4761-ab4e-38aec08f2c1b" containerID="7403457696ed16220199b64b0bbf9863f3e1635cb48f2f35e467b3ea6d347b4b" exitCode=0 Feb 17 15:35:04 crc kubenswrapper[4806]: I0217 15:35:04.994465 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" event={"ID":"2abc5787-64ef-4761-ab4e-38aec08f2c1b","Type":"ContainerDied","Data":"7403457696ed16220199b64b0bbf9863f3e1635cb48f2f35e467b3ea6d347b4b"} Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.349658 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.524669 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data\") pod \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.524826 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2pnb\" (UniqueName: \"kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb\") pod \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\" (UID: \"2abc5787-64ef-4761-ab4e-38aec08f2c1b\") " Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.540847 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb" (OuterVolumeSpecName: "kube-api-access-z2pnb") pod "2abc5787-64ef-4761-ab4e-38aec08f2c1b" (UID: "2abc5787-64ef-4761-ab4e-38aec08f2c1b"). InnerVolumeSpecName "kube-api-access-z2pnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.572938 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data" (OuterVolumeSpecName: "config-data") pod "2abc5787-64ef-4761-ab4e-38aec08f2c1b" (UID: "2abc5787-64ef-4761-ab4e-38aec08f2c1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.627379 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2pnb\" (UniqueName: \"kubernetes.io/projected/2abc5787-64ef-4761-ab4e-38aec08f2c1b-kube-api-access-z2pnb\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:06 crc kubenswrapper[4806]: I0217 15:35:06.627468 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2abc5787-64ef-4761-ab4e-38aec08f2c1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.012208 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" event={"ID":"2abc5787-64ef-4761-ab4e-38aec08f2c1b","Type":"ContainerDied","Data":"4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb"} Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.012272 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ccd040f8469025c1525b20a2880319d0302ea8cff21c012e783214917fbc7bb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.012278 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-nzrjr" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.229552 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-k8dwb"] Feb 17 15:35:07 crc kubenswrapper[4806]: E0217 15:35:07.229950 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abc5787-64ef-4761-ab4e-38aec08f2c1b" containerName="keystone-db-sync" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.229983 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abc5787-64ef-4761-ab4e-38aec08f2c1b" containerName="keystone-db-sync" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.230219 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abc5787-64ef-4761-ab4e-38aec08f2c1b" containerName="keystone-db-sync" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.230982 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.234245 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.234745 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.235051 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.235292 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-zwwkb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.249784 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.248396 4806 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["glance-kuttl-tests/keystone-bootstrap-k8dwb"] Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.352449 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwn8\" (UniqueName: \"kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.352535 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.352843 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.352926 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.353030 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: 
\"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.454196 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.454281 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.454340 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.454427 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.454534 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwn8\" (UniqueName: \"kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " 
pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.460826 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.461261 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.461866 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.465518 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.473367 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwn8\" (UniqueName: \"kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8\") pod \"keystone-bootstrap-k8dwb\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 
15:35:07.570968 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.993965 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6"] Feb 17 15:35:07 crc kubenswrapper[4806]: I0217 15:35:07.995465 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.000841 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.017815 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6"] Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.041946 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-k8dwb"] Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.067826 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5kvx\" (UniqueName: \"kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.067883 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: 
\"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.068026 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.168986 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.169544 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5kvx\" (UniqueName: \"kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.169643 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 
15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.169758 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.170301 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.185861 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5kvx\" (UniqueName: \"kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx\") pod \"28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.317448 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.812482 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw"] Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.856583 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6"] Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.856804 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.870587 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw"] Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.882101 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.882164 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58jt\" (UniqueName: \"kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 
15:35:08.882221 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.983991 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.984063 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58jt\" (UniqueName: \"kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.984121 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.984683 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:08 crc kubenswrapper[4806]: I0217 15:35:08.984728 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.004126 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58jt\" (UniqueName: \"kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt\") pod \"61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.033705 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerStarted","Data":"02b26598ca1c29eeda191b4d65bd8b7339a619fcaafca8156f902182f7676295"} Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.033817 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerStarted","Data":"8fc1166029b8875f6be4653c43cb44e3b1c0af86555a1da67a948d5ed2a39f6e"} Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.041961 4806 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" event={"ID":"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1","Type":"ContainerStarted","Data":"535485cb7e686aea0c86925386253f72304d073254f9ec770a712183786aac51"} Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.042014 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" event={"ID":"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1","Type":"ContainerStarted","Data":"8dae4409dd8c85f4cd2ac7e664a36de83862daf9de221eb524966589c125b8a9"} Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.237467 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.690441 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" podStartSLOduration=2.690420997 podStartE2EDuration="2.690420997s" podCreationTimestamp="2026-02-17 15:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:35:09.070845187 +0000 UTC m=+870.601475618" watchObservedRunningTime="2026-02-17 15:35:09.690420997 +0000 UTC m=+871.221051428" Feb 17 15:35:09 crc kubenswrapper[4806]: I0217 15:35:09.692200 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw"] Feb 17 15:35:09 crc kubenswrapper[4806]: W0217 15:35:09.713702 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7062487_b8c9_4591_9d77_395a752598ce.slice/crio-2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42 WatchSource:0}: Error finding container 2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42: 
Status 404 returned error can't find the container with id 2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42 Feb 17 15:35:10 crc kubenswrapper[4806]: I0217 15:35:10.052289 4806 generic.go:334] "Generic (PLEG): container finished" podID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerID="02b26598ca1c29eeda191b4d65bd8b7339a619fcaafca8156f902182f7676295" exitCode=0 Feb 17 15:35:10 crc kubenswrapper[4806]: I0217 15:35:10.052382 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerDied","Data":"02b26598ca1c29eeda191b4d65bd8b7339a619fcaafca8156f902182f7676295"} Feb 17 15:35:10 crc kubenswrapper[4806]: I0217 15:35:10.056185 4806 generic.go:334] "Generic (PLEG): container finished" podID="c7062487-b8c9-4591-9d77-395a752598ce" containerID="2db449026d133ae311a6b800de06262bf1e96db4ed114ec30d60bb9f59e65f23" exitCode=0 Feb 17 15:35:10 crc kubenswrapper[4806]: I0217 15:35:10.056943 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" event={"ID":"c7062487-b8c9-4591-9d77-395a752598ce","Type":"ContainerDied","Data":"2db449026d133ae311a6b800de06262bf1e96db4ed114ec30d60bb9f59e65f23"} Feb 17 15:35:10 crc kubenswrapper[4806]: I0217 15:35:10.057019 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" event={"ID":"c7062487-b8c9-4591-9d77-395a752598ce","Type":"ContainerStarted","Data":"2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42"} Feb 17 15:35:11 crc kubenswrapper[4806]: I0217 15:35:11.072175 4806 generic.go:334] "Generic (PLEG): container finished" podID="c7062487-b8c9-4591-9d77-395a752598ce" containerID="f8cbcc4accabeffe1b5926419267f1410a86b63deadb1dea4b8c0d1b90291399" exitCode=0 Feb 17 15:35:11 crc 
kubenswrapper[4806]: I0217 15:35:11.072694 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" event={"ID":"c7062487-b8c9-4591-9d77-395a752598ce","Type":"ContainerDied","Data":"f8cbcc4accabeffe1b5926419267f1410a86b63deadb1dea4b8c0d1b90291399"} Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.081495 4806 generic.go:334] "Generic (PLEG): container finished" podID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerID="fbd4762553e392de2367055915f54c9c58e8b426385770a2ad1deff795d1470b" exitCode=0 Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.081574 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerDied","Data":"fbd4762553e392de2367055915f54c9c58e8b426385770a2ad1deff795d1470b"} Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.083716 4806 generic.go:334] "Generic (PLEG): container finished" podID="da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" containerID="535485cb7e686aea0c86925386253f72304d073254f9ec770a712183786aac51" exitCode=0 Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.083775 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" event={"ID":"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1","Type":"ContainerDied","Data":"535485cb7e686aea0c86925386253f72304d073254f9ec770a712183786aac51"} Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.086069 4806 generic.go:334] "Generic (PLEG): container finished" podID="c7062487-b8c9-4591-9d77-395a752598ce" containerID="397ea3156b812fcf9a9d21ad9a4c0ffd5f8fdc47cb033d5541636f94cc69c67d" exitCode=0 Feb 17 15:35:12 crc kubenswrapper[4806]: I0217 15:35:12.086104 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" 
event={"ID":"c7062487-b8c9-4591-9d77-395a752598ce","Type":"ContainerDied","Data":"397ea3156b812fcf9a9d21ad9a4c0ffd5f8fdc47cb033d5541636f94cc69c67d"} Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.094959 4806 generic.go:334] "Generic (PLEG): container finished" podID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerID="11eadc32269e63720fed3266984b4762e8c481dce060d71a11d75402942720af" exitCode=0 Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.095119 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerDied","Data":"11eadc32269e63720fed3266984b4762e8c481dce060d71a11d75402942720af"} Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.486398 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.493700 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.553105 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data\") pod \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.553652 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys\") pod \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.553897 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util\") pod \"c7062487-b8c9-4591-9d77-395a752598ce\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.554227 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwn8\" (UniqueName: \"kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8\") pod \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.554598 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x58jt\" (UniqueName: \"kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt\") pod \"c7062487-b8c9-4591-9d77-395a752598ce\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.554763 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts\") pod \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.554927 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle\") pod \"c7062487-b8c9-4591-9d77-395a752598ce\" (UID: \"c7062487-b8c9-4591-9d77-395a752598ce\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.555248 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys\") pod \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\" (UID: \"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1\") " Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.556141 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle" (OuterVolumeSpecName: "bundle") pod "c7062487-b8c9-4591-9d77-395a752598ce" (UID: "c7062487-b8c9-4591-9d77-395a752598ce"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.560784 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt" (OuterVolumeSpecName: "kube-api-access-x58jt") pod "c7062487-b8c9-4591-9d77-395a752598ce" (UID: "c7062487-b8c9-4591-9d77-395a752598ce"). InnerVolumeSpecName "kube-api-access-x58jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.560962 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts" (OuterVolumeSpecName: "scripts") pod "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" (UID: "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.563983 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" (UID: "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.564095 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8" (OuterVolumeSpecName: "kube-api-access-zqwn8") pod "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" (UID: "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1"). InnerVolumeSpecName "kube-api-access-zqwn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.564722 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" (UID: "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.578198 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data" (OuterVolumeSpecName: "config-data") pod "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" (UID: "da2e9089-1a6a-40e3-bb6a-e61f509ed5c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.582201 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util" (OuterVolumeSpecName: "util") pod "c7062487-b8c9-4591-9d77-395a752598ce" (UID: "c7062487-b8c9-4591-9d77-395a752598ce"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657533 4806 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657589 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657599 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwn8\" (UniqueName: \"kubernetes.io/projected/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-kube-api-access-zqwn8\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657610 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58jt\" (UniqueName: \"kubernetes.io/projected/c7062487-b8c9-4591-9d77-395a752598ce-kube-api-access-x58jt\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc 
kubenswrapper[4806]: I0217 15:35:13.657637 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657647 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7062487-b8c9-4591-9d77-395a752598ce-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657656 4806 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:13 crc kubenswrapper[4806]: I0217 15:35:13.657665 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.106506 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" event={"ID":"da2e9089-1a6a-40e3-bb6a-e61f509ed5c1","Type":"ContainerDied","Data":"8dae4409dd8c85f4cd2ac7e664a36de83862daf9de221eb524966589c125b8a9"} Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.106564 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dae4409dd8c85f4cd2ac7e664a36de83862daf9de221eb524966589c125b8a9" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.106599 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-k8dwb" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.109064 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" event={"ID":"c7062487-b8c9-4591-9d77-395a752598ce","Type":"ContainerDied","Data":"2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42"} Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.109098 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.109117 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bbaf61b60f53f9e42ae7a1547243863b90f4b76277fb8697d6e1436f4ca9a42" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.245619 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-75c54d45c8-njkpm"] Feb 17 15:35:14 crc kubenswrapper[4806]: E0217 15:35:14.245970 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="pull" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.245995 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="pull" Feb 17 15:35:14 crc kubenswrapper[4806]: E0217 15:35:14.246011 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" containerName="keystone-bootstrap" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246019 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" containerName="keystone-bootstrap" Feb 17 15:35:14 crc kubenswrapper[4806]: E0217 15:35:14.246037 4806 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="extract" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246046 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="extract" Feb 17 15:35:14 crc kubenswrapper[4806]: E0217 15:35:14.246066 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="util" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246075 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="util" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246225 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" containerName="keystone-bootstrap" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246247 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7062487-b8c9-4591-9d77-395a752598ce" containerName="extract" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.246812 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.249798 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-zwwkb" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.249844 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.249940 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.249972 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.261212 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-75c54d45c8-njkpm"] Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.266736 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-fernet-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.266800 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-config-data\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.266835 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-credential-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.266874 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-scripts\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.267079 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9dn\" (UniqueName: \"kubernetes.io/projected/93a1bb0d-88da-450c-bea2-ced1b019457b-kube-api-access-ms9dn\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.368013 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9dn\" (UniqueName: \"kubernetes.io/projected/93a1bb0d-88da-450c-bea2-ced1b019457b-kube-api-access-ms9dn\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.368570 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-fernet-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.368619 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-config-data\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.368647 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-credential-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.368683 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-scripts\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.373229 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-credential-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.373631 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-config-data\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.373954 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-scripts\") pod \"keystone-75c54d45c8-njkpm\" 
(UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.374652 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93a1bb0d-88da-450c-bea2-ced1b019457b-fernet-keys\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.392035 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9dn\" (UniqueName: \"kubernetes.io/projected/93a1bb0d-88da-450c-bea2-ced1b019457b-kube-api-access-ms9dn\") pod \"keystone-75c54d45c8-njkpm\" (UID: \"93a1bb0d-88da-450c-bea2-ced1b019457b\") " pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.454904 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.470229 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle" (OuterVolumeSpecName: "bundle") pod "f5a2c8b8-8042-4f62-a5d5-bed880f65261" (UID: "f5a2c8b8-8042-4f62-a5d5-bed880f65261"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.470288 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle\") pod \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.470363 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5kvx\" (UniqueName: \"kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx\") pod \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.470452 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util\") pod \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\" (UID: \"f5a2c8b8-8042-4f62-a5d5-bed880f65261\") " Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.470716 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.479033 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx" (OuterVolumeSpecName: "kube-api-access-v5kvx") pod "f5a2c8b8-8042-4f62-a5d5-bed880f65261" (UID: "f5a2c8b8-8042-4f62-a5d5-bed880f65261"). InnerVolumeSpecName "kube-api-access-v5kvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.569738 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.575997 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5kvx\" (UniqueName: \"kubernetes.io/projected/f5a2c8b8-8042-4f62-a5d5-bed880f65261-kube-api-access-v5kvx\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.792097 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util" (OuterVolumeSpecName: "util") pod "f5a2c8b8-8042-4f62-a5d5-bed880f65261" (UID: "f5a2c8b8-8042-4f62-a5d5-bed880f65261"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:35:14 crc kubenswrapper[4806]: I0217 15:35:14.882656 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5a2c8b8-8042-4f62-a5d5-bed880f65261-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:15 crc kubenswrapper[4806]: W0217 15:35:15.013426 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93a1bb0d_88da_450c_bea2_ced1b019457b.slice/crio-db400f9f53ce020debd45f32d38cd7195d65bc8442a189f7e4dd9ad49a7d65cb WatchSource:0}: Error finding container db400f9f53ce020debd45f32d38cd7195d65bc8442a189f7e4dd9ad49a7d65cb: Status 404 returned error can't find the container with id db400f9f53ce020debd45f32d38cd7195d65bc8442a189f7e4dd9ad49a7d65cb Feb 17 15:35:15 crc kubenswrapper[4806]: I0217 15:35:15.015603 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-75c54d45c8-njkpm"] Feb 17 15:35:15 crc kubenswrapper[4806]: I0217 15:35:15.119730 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" Feb 17 15:35:15 crc kubenswrapper[4806]: I0217 15:35:15.119757 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6" event={"ID":"f5a2c8b8-8042-4f62-a5d5-bed880f65261","Type":"ContainerDied","Data":"8fc1166029b8875f6be4653c43cb44e3b1c0af86555a1da67a948d5ed2a39f6e"} Feb 17 15:35:15 crc kubenswrapper[4806]: I0217 15:35:15.119797 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc1166029b8875f6be4653c43cb44e3b1c0af86555a1da67a948d5ed2a39f6e" Feb 17 15:35:15 crc kubenswrapper[4806]: I0217 15:35:15.120640 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" event={"ID":"93a1bb0d-88da-450c-bea2-ced1b019457b","Type":"ContainerStarted","Data":"db400f9f53ce020debd45f32d38cd7195d65bc8442a189f7e4dd9ad49a7d65cb"} Feb 17 15:35:16 crc kubenswrapper[4806]: I0217 15:35:16.134452 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" event={"ID":"93a1bb0d-88da-450c-bea2-ced1b019457b","Type":"ContainerStarted","Data":"82dd0b947cab341b8f33f714f522d6b01e92e8a4b2f090d651b5a0d094682c11"} Feb 17 15:35:16 crc kubenswrapper[4806]: I0217 15:35:16.134837 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:16 crc kubenswrapper[4806]: I0217 15:35:16.162566 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" podStartSLOduration=2.162544039 podStartE2EDuration="2.162544039s" podCreationTimestamp="2026-02-17 15:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:35:16.156570402 +0000 UTC 
m=+877.687200893" watchObservedRunningTime="2026-02-17 15:35:16.162544039 +0000 UTC m=+877.693174470" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.310089 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55"] Feb 17 15:35:27 crc kubenswrapper[4806]: E0217 15:35:27.311101 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="util" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.311179 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="util" Feb 17 15:35:27 crc kubenswrapper[4806]: E0217 15:35:27.311197 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="extract" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.311205 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="extract" Feb 17 15:35:27 crc kubenswrapper[4806]: E0217 15:35:27.311226 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="pull" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.311234 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="pull" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.311347 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a2c8b8-8042-4f62-a5d5-bed880f65261" containerName="extract" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.311796 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.315339 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.316592 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xbtc5" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.328808 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55"] Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.388222 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-apiservice-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.388731 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-webhook-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.388850 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntksd\" (UniqueName: \"kubernetes.io/projected/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-kube-api-access-ntksd\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: 
\"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.490961 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntksd\" (UniqueName: \"kubernetes.io/projected/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-kube-api-access-ntksd\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.491094 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-apiservice-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.491124 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-webhook-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.498640 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-apiservice-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.498824 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-webhook-cert\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.509617 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntksd\" (UniqueName: \"kubernetes.io/projected/c47438c6-0196-42f5-8f8f-bf5e9ed6df78-kube-api-access-ntksd\") pod \"swift-operator-controller-manager-69cdff58cd-ggj55\" (UID: \"c47438c6-0196-42f5-8f8f-bf5e9ed6df78\") " pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:27 crc kubenswrapper[4806]: I0217 15:35:27.645635 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" Feb 17 15:35:28 crc kubenswrapper[4806]: I0217 15:35:28.089135 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:35:28 crc kubenswrapper[4806]: I0217 15:35:28.091938 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55"] Feb 17 15:35:28 crc kubenswrapper[4806]: I0217 15:35:28.226019 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" event={"ID":"c47438c6-0196-42f5-8f8f-bf5e9ed6df78","Type":"ContainerStarted","Data":"2ac73af2f802ff9d563bb4c348ca4947bbbf6e9c61f95dede693fe3cc116638e"} Feb 17 15:35:31 crc kubenswrapper[4806]: I0217 15:35:31.248556 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" 
event={"ID":"c47438c6-0196-42f5-8f8f-bf5e9ed6df78","Type":"ContainerStarted","Data":"d512c2999f80c5a45d26dcbed0240d8a900a58fa7bbfd7df7af45fd54f8a47a1"}
Feb 17 15:35:31 crc kubenswrapper[4806]: I0217 15:35:31.249223 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55"
Feb 17 15:35:31 crc kubenswrapper[4806]: I0217 15:35:31.270212 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55" podStartSLOduration=1.97400061 podStartE2EDuration="4.270192129s" podCreationTimestamp="2026-02-17 15:35:27 +0000 UTC" firstStartedPulling="2026-02-17 15:35:28.088834941 +0000 UTC m=+889.619465352" lastFinishedPulling="2026-02-17 15:35:30.38502646 +0000 UTC m=+891.915656871" observedRunningTime="2026-02-17 15:35:31.263859873 +0000 UTC m=+892.794490294" watchObservedRunningTime="2026-02-17 15:35:31.270192129 +0000 UTC m=+892.800822540"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.019210 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"]
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.022574 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.026815 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.026877 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-92bkx"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.053425 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"]
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.129499 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnfw2\" (UniqueName: \"kubernetes.io/projected/c51c0d3e-e13f-4cdb-a842-d27644641a79-kube-api-access-fnfw2\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.129562 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-webhook-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.129833 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-apiservice-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.231801 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-apiservice-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.231904 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnfw2\" (UniqueName: \"kubernetes.io/projected/c51c0d3e-e13f-4cdb-a842-d27644641a79-kube-api-access-fnfw2\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.231945 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-webhook-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.238688 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-webhook-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.238799 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c51c0d3e-e13f-4cdb-a842-d27644641a79-apiservice-cert\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.261043 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnfw2\" (UniqueName: \"kubernetes.io/projected/c51c0d3e-e13f-4cdb-a842-d27644641a79-kube-api-access-fnfw2\") pod \"horizon-operator-controller-manager-7684c4dfd4-hc257\" (UID: \"c51c0d3e-e13f-4cdb-a842-d27644641a79\") " pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.355983 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.651442 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-69cdff58cd-ggj55"
Feb 17 15:35:37 crc kubenswrapper[4806]: I0217 15:35:37.807444 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"]
Feb 17 15:35:38 crc kubenswrapper[4806]: I0217 15:35:38.298276 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257" event={"ID":"c51c0d3e-e13f-4cdb-a842-d27644641a79","Type":"ContainerStarted","Data":"84b9595a965b40242b80eb75d97723334afdab2a41d3f84ca8d88ad17ee74cb0"}
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.577419 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"]
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.581178 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.583298 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.584065 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.584321 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-tt92r"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.584480 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.637325 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"]
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.672519 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.672594 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.672614 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-lock\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.672650 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvxj4\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-kube-api-access-cvxj4\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.672679 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-cache\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.773838 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvxj4\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-kube-api-access-cvxj4\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.773916 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-cache\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.773987 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.774027 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.774050 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-lock\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: E0217 15:35:39.774466 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 15:35:39 crc kubenswrapper[4806]: E0217 15:35:39.774623 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.774640 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-cache\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.774724 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/83699dfd-16c6-425d-b761-26b3635984ae-lock\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: E0217 15:35:39.774867 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:35:40.274727103 +0000 UTC m=+901.805357504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.775255 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.800244 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvxj4\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-kube-api-access-cvxj4\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:39 crc kubenswrapper[4806]: I0217 15:35:39.805864 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.206362 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-zl4m6"]
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.216777 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.218654 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.219202 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.219629 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-zl4m6"]
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.225294 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.281683 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.281765 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.281821 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z4b\" (UniqueName: \"kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.281910 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.281982 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.282001 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.282018 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: E0217 15:35:40.282159 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 15:35:40 crc kubenswrapper[4806]: E0217 15:35:40.282201 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 15:35:40 crc kubenswrapper[4806]: E0217 15:35:40.282262 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:35:41.282241789 +0000 UTC m=+902.812872270 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.383692 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384060 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z4b\" (UniqueName: \"kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384107 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384168 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384184 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384207 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.384617 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.385607 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.385909 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.387925 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.391131 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.401960 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z4b\" (UniqueName: \"kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b\") pod \"swift-ring-rebalance-zl4m6\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.533511 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.973725 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"]
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.983979 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.986414 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"]
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.996486 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.996562 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:40 crc kubenswrapper[4806]: I0217 15:35:40.996653 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wld\" (UniqueName: \"kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.097649 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wld\" (UniqueName: \"kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.097697 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.097738 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.098241 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.098533 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.123491 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wld\" (UniqueName: \"kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld\") pod \"redhat-operators-zvrtb\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.275645 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-zl4m6"]
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.301702 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:41 crc kubenswrapper[4806]: E0217 15:35:41.301981 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 15:35:41 crc kubenswrapper[4806]: E0217 15:35:41.302024 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 15:35:41 crc kubenswrapper[4806]: E0217 15:35:41.302094 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:35:43.302065939 +0000 UTC m=+904.832696340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.311082 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrtb"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.322110 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" event={"ID":"46887851-3f0c-4edf-ad3f-87602700b860","Type":"ContainerStarted","Data":"1ca7fda1e985d652000371d95c1c11ed6c49f9e4bf95de0d3aa4867f3ff3654e"}
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.323705 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257" event={"ID":"c51c0d3e-e13f-4cdb-a842-d27644641a79","Type":"ContainerStarted","Data":"14688d7b118332a9ee15bff1b8842d2d1b3352ad3c50f13be51a1a5dfbafb1a6"}
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.324938 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.370241 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257" podStartSLOduration=2.446066735 podStartE2EDuration="5.370219s" podCreationTimestamp="2026-02-17 15:35:36 +0000 UTC" firstStartedPulling="2026-02-17 15:35:37.818994441 +0000 UTC m=+899.349624852" lastFinishedPulling="2026-02-17 15:35:40.743146706 +0000 UTC m=+902.273777117" observedRunningTime="2026-02-17 15:35:41.366244882 +0000 UTC m=+902.896875293" watchObservedRunningTime="2026-02-17 15:35:41.370219 +0000 UTC m=+902.900849411"
Feb 17 15:35:41 crc kubenswrapper[4806]: I0217 15:35:41.797040 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"]
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.335595 4806 generic.go:334] "Generic (PLEG): container finished" podID="83e7972c-0c91-4309-af07-16e729bb8c84" containerID="f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289" exitCode=0
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.335688 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerDied","Data":"f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289"}
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.335748 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerStarted","Data":"4d3784bd67ef46d94f1ff03e278aa692de42d5858e68be12993a3d7336206115"}
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.368864 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-5v785"]
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.369719 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-5v785"
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.371917 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-7bqwk"
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.384217 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-5v785"]
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.430948 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sngtn\" (UniqueName: \"kubernetes.io/projected/96c88e0a-5c93-40c3-b3d4-91cfdb8b6148-kube-api-access-sngtn\") pod \"glance-operator-index-5v785\" (UID: \"96c88e0a-5c93-40c3-b3d4-91cfdb8b6148\") " pod="openstack-operators/glance-operator-index-5v785"
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.533050 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sngtn\" (UniqueName: \"kubernetes.io/projected/96c88e0a-5c93-40c3-b3d4-91cfdb8b6148-kube-api-access-sngtn\") pod \"glance-operator-index-5v785\" (UID: \"96c88e0a-5c93-40c3-b3d4-91cfdb8b6148\") " pod="openstack-operators/glance-operator-index-5v785"
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.568184 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sngtn\" (UniqueName: \"kubernetes.io/projected/96c88e0a-5c93-40c3-b3d4-91cfdb8b6148-kube-api-access-sngtn\") pod \"glance-operator-index-5v785\" (UID: \"96c88e0a-5c93-40c3-b3d4-91cfdb8b6148\") " pod="openstack-operators/glance-operator-index-5v785"
Feb 17 15:35:42 crc kubenswrapper[4806]: I0217 15:35:42.687558 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-index-5v785"
Feb 17 15:35:43 crc kubenswrapper[4806]: I0217 15:35:43.150353 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-5v785"]
Feb 17 15:35:43 crc kubenswrapper[4806]: I0217 15:35:43.343760 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-5v785" event={"ID":"96c88e0a-5c93-40c3-b3d4-91cfdb8b6148","Type":"ContainerStarted","Data":"3fe98a28b98352feaf63c6cc2c4f3248ad20481affb5b36f02e426a52804dd49"}
Feb 17 15:35:43 crc kubenswrapper[4806]: I0217 15:35:43.346013 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0"
Feb 17 15:35:43 crc kubenswrapper[4806]: E0217 15:35:43.346206 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 15:35:43 crc kubenswrapper[4806]: E0217 15:35:43.346239 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 15:35:43 crc kubenswrapper[4806]: E0217 15:35:43.346289 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:35:47.346271643 +0000 UTC m=+908.876902054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found
Feb 17 15:35:43 crc kubenswrapper[4806]: I0217 15:35:43.347382 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerStarted","Data":"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f"}
Feb 17 15:35:44 crc kubenswrapper[4806]: I0217 15:35:44.359311 4806 generic.go:334] "Generic (PLEG): container finished" podID="83e7972c-0c91-4309-af07-16e729bb8c84" containerID="5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f" exitCode=0
Feb 17 15:35:44 crc kubenswrapper[4806]: I0217 15:35:44.359352 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerDied","Data":"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f"}
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.763379 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2x94t"]
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.765230 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2x94t"
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.789296 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2x94t"]
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.795553 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t"
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.795597 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t"
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.795615 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlh7\" (UniqueName: \"kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t"
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.897057 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t"
Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.897101 4806 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-cxlh7\" (UniqueName: \"kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.897121 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.897846 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.897932 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:45 crc kubenswrapper[4806]: I0217 15:35:45.920595 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxlh7\" (UniqueName: \"kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7\") pod \"community-operators-2x94t\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:46 crc kubenswrapper[4806]: I0217 15:35:46.055987 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="glance-kuttl-tests/keystone-75c54d45c8-njkpm" Feb 17 15:35:46 crc kubenswrapper[4806]: I0217 15:35:46.087000 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:46 crc kubenswrapper[4806]: I0217 15:35:46.383187 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" event={"ID":"46887851-3f0c-4edf-ad3f-87602700b860","Type":"ContainerStarted","Data":"86026e9ed7aa087a8ad6c3e4f58cceaaf6204666b4fe85573c028b929db94dce"} Feb 17 15:35:46 crc kubenswrapper[4806]: I0217 15:35:46.405658 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" podStartSLOduration=2.094311788 podStartE2EDuration="6.405642422s" podCreationTimestamp="2026-02-17 15:35:40 +0000 UTC" firstStartedPulling="2026-02-17 15:35:41.283483171 +0000 UTC m=+902.814113582" lastFinishedPulling="2026-02-17 15:35:45.594813785 +0000 UTC m=+907.125444216" observedRunningTime="2026-02-17 15:35:46.402266038 +0000 UTC m=+907.932896449" watchObservedRunningTime="2026-02-17 15:35:46.405642422 +0000 UTC m=+907.936272833" Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.360554 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7684c4dfd4-hc257" Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.428434 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 15:35:47 crc kubenswrapper[4806]: E0217 15:35:47.428602 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:35:47 crc 
kubenswrapper[4806]: E0217 15:35:47.428620 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:35:47 crc kubenswrapper[4806]: E0217 15:35:47.428666 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:35:55.428648571 +0000 UTC m=+916.959278982 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.460508 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2x94t"] Feb 17 15:35:47 crc kubenswrapper[4806]: W0217 15:35:47.844631 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa0bac5_5565_409b_8f08_18bf764c2a09.slice/crio-6d073f4ced84db72c69636cfcf2bb166f370024e528b59c434092af5d2ec6ab0 WatchSource:0}: Error finding container 6d073f4ced84db72c69636cfcf2bb166f370024e528b59c434092af5d2ec6ab0: Status 404 returned error can't find the container with id 6d073f4ced84db72c69636cfcf2bb166f370024e528b59c434092af5d2ec6ab0 Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.965832 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.967260 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:47 crc kubenswrapper[4806]: I0217 15:35:47.984219 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.038364 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.038458 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2nt\" (UniqueName: \"kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.038594 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.139943 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.141081 4806 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.141525 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.142026 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.142229 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2nt\" (UniqueName: \"kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.175362 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2nt\" (UniqueName: \"kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt\") pod \"certified-operators-tsxl6\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.414363 4806 generic.go:334] "Generic (PLEG): container finished" 
podID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerID="ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac" exitCode=0 Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.414465 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerDied","Data":"ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac"} Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.414505 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerStarted","Data":"6d073f4ced84db72c69636cfcf2bb166f370024e528b59c434092af5d2ec6ab0"} Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.419801 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-5v785" event={"ID":"96c88e0a-5c93-40c3-b3d4-91cfdb8b6148","Type":"ContainerStarted","Data":"881dd082c0d59b1f701fdcdf0e8821e7730eee63805da7222052ac2c8a1bc268"} Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.423123 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerStarted","Data":"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a"} Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.436011 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.508678 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-5v785" podStartSLOduration=1.764311162 podStartE2EDuration="6.508649335s" podCreationTimestamp="2026-02-17 15:35:42 +0000 UTC" firstStartedPulling="2026-02-17 15:35:43.182660098 +0000 UTC m=+904.713290509" lastFinishedPulling="2026-02-17 15:35:47.926998271 +0000 UTC m=+909.457628682" observedRunningTime="2026-02-17 15:35:48.491883351 +0000 UTC m=+910.022513772" watchObservedRunningTime="2026-02-17 15:35:48.508649335 +0000 UTC m=+910.039279746" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.539167 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zvrtb" podStartSLOduration=3.035970229 podStartE2EDuration="8.539140467s" podCreationTimestamp="2026-02-17 15:35:40 +0000 UTC" firstStartedPulling="2026-02-17 15:35:42.339106814 +0000 UTC m=+903.869737225" lastFinishedPulling="2026-02-17 15:35:47.842277052 +0000 UTC m=+909.372907463" observedRunningTime="2026-02-17 15:35:48.516592021 +0000 UTC m=+910.047222462" watchObservedRunningTime="2026-02-17 15:35:48.539140467 +0000 UTC m=+910.069770878" Feb 17 15:35:48 crc kubenswrapper[4806]: I0217 15:35:48.935396 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:35:49 crc kubenswrapper[4806]: I0217 15:35:49.438133 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerStarted","Data":"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a"} Feb 17 15:35:49 crc kubenswrapper[4806]: I0217 15:35:49.438654 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerStarted","Data":"8fb2d86fa32038d0400cca0cbba45e3e0f1d944c0f6f8b810a84b2ef3f355fcf"} Feb 17 15:35:50 crc kubenswrapper[4806]: I0217 15:35:50.444547 4806 generic.go:334] "Generic (PLEG): container finished" podID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerID="b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a" exitCode=0 Feb 17 15:35:50 crc kubenswrapper[4806]: I0217 15:35:50.444635 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerDied","Data":"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a"} Feb 17 15:35:50 crc kubenswrapper[4806]: I0217 15:35:50.447768 4806 generic.go:334] "Generic (PLEG): container finished" podID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerID="e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee" exitCode=0 Feb 17 15:35:50 crc kubenswrapper[4806]: I0217 15:35:50.447810 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerDied","Data":"e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee"} Feb 17 15:35:51 crc kubenswrapper[4806]: I0217 15:35:51.311521 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:35:51 crc kubenswrapper[4806]: I0217 15:35:51.312073 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:35:51 crc kubenswrapper[4806]: I0217 15:35:51.458953 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" 
event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerStarted","Data":"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b"} Feb 17 15:35:51 crc kubenswrapper[4806]: I0217 15:35:51.461848 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerStarted","Data":"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454"} Feb 17 15:35:51 crc kubenswrapper[4806]: I0217 15:35:51.506228 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2x94t" podStartSLOduration=4.077558815 podStartE2EDuration="6.50620916s" podCreationTimestamp="2026-02-17 15:35:45 +0000 UTC" firstStartedPulling="2026-02-17 15:35:48.41601512 +0000 UTC m=+909.946645531" lastFinishedPulling="2026-02-17 15:35:50.844665465 +0000 UTC m=+912.375295876" observedRunningTime="2026-02-17 15:35:51.504602261 +0000 UTC m=+913.035232682" watchObservedRunningTime="2026-02-17 15:35:51.50620916 +0000 UTC m=+913.036839561" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.363724 4806 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zvrtb" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="registry-server" probeResult="failure" output=< Feb 17 15:35:52 crc kubenswrapper[4806]: timeout: failed to connect service ":50051" within 1s Feb 17 15:35:52 crc kubenswrapper[4806]: > Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.488862 4806 generic.go:334] "Generic (PLEG): container finished" podID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerID="7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b" exitCode=0 Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.489805 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" 
event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerDied","Data":"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b"} Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.546805 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht"] Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.548178 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.563435 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht"] Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.616873 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-log-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.616973 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813437e-d2ad-4742-8598-5d78f8026604-config-data\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.617031 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.617165 4806 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vmb\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-kube-api-access-55vmb\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.617242 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-run-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.688125 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-5v785" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.688218 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-5v785" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.718719 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55vmb\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-kube-api-access-55vmb\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.718797 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-run-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 
15:35:52.718857 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-log-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.719389 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-run-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.719616 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5813437e-d2ad-4742-8598-5d78f8026604-log-httpd\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.719633 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813437e-d2ad-4742-8598-5d78f8026604-config-data\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.719787 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: E0217 15:35:52.719923 4806 projected.go:288] Couldn't get configMap 
glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:35:52 crc kubenswrapper[4806]: E0217 15:35:52.719946 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht: configmap "swift-ring-files" not found Feb 17 15:35:52 crc kubenswrapper[4806]: E0217 15:35:52.719999 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift podName:5813437e-d2ad-4742-8598-5d78f8026604 nodeName:}" failed. No retries permitted until 2026-02-17 15:35:53.219980443 +0000 UTC m=+914.750610854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift") pod "swift-proxy-5f6df75b65-sh9ht" (UID: "5813437e-d2ad-4742-8598-5d78f8026604") : configmap "swift-ring-files" not found Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.734069 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5813437e-d2ad-4742-8598-5d78f8026604-config-data\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.740382 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vmb\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-kube-api-access-55vmb\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:52 crc kubenswrapper[4806]: I0217 15:35:52.752299 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-5v785" Feb 17 15:35:53 crc kubenswrapper[4806]: I0217 15:35:53.229248 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:53 crc kubenswrapper[4806]: E0217 15:35:53.229346 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:35:53 crc kubenswrapper[4806]: E0217 15:35:53.229998 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht: configmap "swift-ring-files" not found Feb 17 15:35:53 crc kubenswrapper[4806]: E0217 15:35:53.230113 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift podName:5813437e-d2ad-4742-8598-5d78f8026604 nodeName:}" failed. No retries permitted until 2026-02-17 15:35:54.230085053 +0000 UTC m=+915.760715464 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift") pod "swift-proxy-5f6df75b65-sh9ht" (UID: "5813437e-d2ad-4742-8598-5d78f8026604") : configmap "swift-ring-files" not found Feb 17 15:35:53 crc kubenswrapper[4806]: I0217 15:35:53.496657 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerStarted","Data":"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558"} Feb 17 15:35:53 crc kubenswrapper[4806]: I0217 15:35:53.518822 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tsxl6" podStartSLOduration=4.0390732 podStartE2EDuration="6.518808144s" podCreationTimestamp="2026-02-17 15:35:47 +0000 UTC" firstStartedPulling="2026-02-17 15:35:50.446149367 +0000 UTC m=+911.976779778" lastFinishedPulling="2026-02-17 15:35:52.925884311 +0000 UTC m=+914.456514722" observedRunningTime="2026-02-17 15:35:53.516806164 +0000 UTC m=+915.047436595" watchObservedRunningTime="2026-02-17 15:35:53.518808144 +0000 UTC m=+915.049438555" Feb 17 15:35:53 crc kubenswrapper[4806]: I0217 15:35:53.532608 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-5v785" Feb 17 15:35:54 crc kubenswrapper[4806]: I0217 15:35:54.245855 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:54 crc kubenswrapper[4806]: E0217 15:35:54.246041 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:35:54 
crc kubenswrapper[4806]: E0217 15:35:54.246083 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht: configmap "swift-ring-files" not found Feb 17 15:35:54 crc kubenswrapper[4806]: E0217 15:35:54.246144 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift podName:5813437e-d2ad-4742-8598-5d78f8026604 nodeName:}" failed. No retries permitted until 2026-02-17 15:35:56.246123091 +0000 UTC m=+917.776753502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift") pod "swift-proxy-5f6df75b65-sh9ht" (UID: "5813437e-d2ad-4742-8598-5d78f8026604") : configmap "swift-ring-files" not found Feb 17 15:35:55 crc kubenswrapper[4806]: I0217 15:35:55.464031 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 15:35:55 crc kubenswrapper[4806]: E0217 15:35:55.464329 4806 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Feb 17 15:35:55 crc kubenswrapper[4806]: E0217 15:35:55.464609 4806 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Feb 17 15:35:55 crc kubenswrapper[4806]: E0217 15:35:55.464692 4806 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift podName:83699dfd-16c6-425d-b761-26b3635984ae nodeName:}" failed. No retries permitted until 2026-02-17 15:36:11.464665821 +0000 UTC m=+932.995296232 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift") pod "swift-storage-0" (UID: "83699dfd-16c6-425d-b761-26b3635984ae") : configmap "swift-ring-files" not found Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.087990 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.088352 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.147248 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.278099 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.285490 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5813437e-d2ad-4742-8598-5d78f8026604-etc-swift\") pod \"swift-proxy-5f6df75b65-sh9ht\" (UID: \"5813437e-d2ad-4742-8598-5d78f8026604\") " pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.473935 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.520525 4806 generic.go:334] "Generic (PLEG): container finished" podID="46887851-3f0c-4edf-ad3f-87602700b860" containerID="86026e9ed7aa087a8ad6c3e4f58cceaaf6204666b4fe85573c028b929db94dce" exitCode=0 Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.521259 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" event={"ID":"46887851-3f0c-4edf-ad3f-87602700b860","Type":"ContainerDied","Data":"86026e9ed7aa087a8ad6c3e4f58cceaaf6204666b4fe85573c028b929db94dce"} Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.568815 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:35:56 crc kubenswrapper[4806]: W0217 15:35:56.952531 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5813437e_d2ad_4742_8598_5d78f8026604.slice/crio-95dc2b23dc7fb24fd2335744234b4518a06c8dc455c6bfd6a712df957f589f7a WatchSource:0}: Error finding container 95dc2b23dc7fb24fd2335744234b4518a06c8dc455c6bfd6a712df957f589f7a: Status 404 returned error can't find the container with id 95dc2b23dc7fb24fd2335744234b4518a06c8dc455c6bfd6a712df957f589f7a Feb 17 15:35:56 crc kubenswrapper[4806]: I0217 15:35:56.953639 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht"] Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.551182 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" event={"ID":"5813437e-d2ad-4742-8598-5d78f8026604","Type":"ContainerStarted","Data":"ea6769066cdeb77b91f86e03780a7d380590eb03c7bb34f200043c58d9a88883"} Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.551691 4806 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" event={"ID":"5813437e-d2ad-4742-8598-5d78f8026604","Type":"ContainerStarted","Data":"1423a9279ea1aa997757354eded3820f97876e6948c6c1a0aeeac722e7a98a0b"} Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.551716 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" event={"ID":"5813437e-d2ad-4742-8598-5d78f8026604","Type":"ContainerStarted","Data":"95dc2b23dc7fb24fd2335744234b4518a06c8dc455c6bfd6a712df957f589f7a"} Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.551886 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.551911 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.603094 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" podStartSLOduration=5.603078429 podStartE2EDuration="5.603078429s" podCreationTimestamp="2026-02-17 15:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:35:57.601611842 +0000 UTC m=+919.132242273" watchObservedRunningTime="2026-02-17 15:35:57.603078429 +0000 UTC m=+919.133708840" Feb 17 15:35:57 crc kubenswrapper[4806]: I0217 15:35:57.974986 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.009202 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.009358 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6z4b\" (UniqueName: \"kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.009442 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.009474 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.010699 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.010830 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift\") pod \"46887851-3f0c-4edf-ad3f-87602700b860\" (UID: \"46887851-3f0c-4edf-ad3f-87602700b860\") " Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.009724 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.012065 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.016419 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b" (OuterVolumeSpecName: "kube-api-access-q6z4b") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "kube-api-access-q6z4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.026299 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.036981 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts" (OuterVolumeSpecName: "scripts") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.039675 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "46887851-3f0c-4edf-ad3f-87602700b860" (UID: "46887851-3f0c-4edf-ad3f-87602700b860"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112741 4806 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112790 4806 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46887851-3f0c-4edf-ad3f-87602700b860-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112803 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112815 4806 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46887851-3f0c-4edf-ad3f-87602700b860-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112828 4806 
reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46887851-3f0c-4edf-ad3f-87602700b860-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.112840 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6z4b\" (UniqueName: \"kubernetes.io/projected/46887851-3f0c-4edf-ad3f-87602700b860-kube-api-access-q6z4b\") on node \"crc\" DevicePath \"\"" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.436521 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.436565 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.493663 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.570848 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" event={"ID":"46887851-3f0c-4edf-ad3f-87602700b860","Type":"ContainerDied","Data":"1ca7fda1e985d652000371d95c1c11ed6c49f9e4bf95de0d3aa4867f3ff3654e"} Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.570903 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ca7fda1e985d652000371d95c1c11ed6c49f9e4bf95de0d3aa4867f3ff3654e" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.571834 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-zl4m6" Feb 17 15:35:58 crc kubenswrapper[4806]: I0217 15:35:58.634487 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.204667 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4"] Feb 17 15:35:59 crc kubenswrapper[4806]: E0217 15:35:59.205283 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46887851-3f0c-4edf-ad3f-87602700b860" containerName="swift-ring-rebalance" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.205301 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="46887851-3f0c-4edf-ad3f-87602700b860" containerName="swift-ring-rebalance" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.205456 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="46887851-3f0c-4edf-ad3f-87602700b860" containerName="swift-ring-rebalance" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.206433 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.212420 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5p4j2" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.218798 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4"] Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.376549 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.376597 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjchp\" (UniqueName: \"kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.376633 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 
15:35:59.477673 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.477722 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjchp\" (UniqueName: \"kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.477777 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.478516 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.478543 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.505071 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjchp\" (UniqueName: \"kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp\") pod \"00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.521695 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:35:59 crc kubenswrapper[4806]: I0217 15:35:59.989130 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4"] Feb 17 15:36:00 crc kubenswrapper[4806]: I0217 15:36:00.591871 4806 generic.go:334] "Generic (PLEG): container finished" podID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerID="588772bb20a1ea3b2491d70d4ec512095647a2a4025d2b4a5b0eb9f0f60be217" exitCode=0 Feb 17 15:36:00 crc kubenswrapper[4806]: I0217 15:36:00.592093 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" event={"ID":"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3","Type":"ContainerDied","Data":"588772bb20a1ea3b2491d70d4ec512095647a2a4025d2b4a5b0eb9f0f60be217"} Feb 17 15:36:00 crc kubenswrapper[4806]: I0217 15:36:00.592485 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" event={"ID":"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3","Type":"ContainerStarted","Data":"ffc8dbe8552a1d7dee5ca19389d207f2c3527393fc9c3327ce73d6b7d7208100"} Feb 17 15:36:01 crc kubenswrapper[4806]: I0217 15:36:01.362943 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:36:01 crc kubenswrapper[4806]: I0217 15:36:01.421181 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:36:01 crc kubenswrapper[4806]: I0217 15:36:01.613968 4806 generic.go:334] "Generic (PLEG): container finished" podID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerID="35c1d901c3a075c0de7fed007dbeed762955783399e0b6f62fac1b563e8f9318" exitCode=0 Feb 17 15:36:01 crc kubenswrapper[4806]: I0217 15:36:01.614308 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" event={"ID":"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3","Type":"ContainerDied","Data":"35c1d901c3a075c0de7fed007dbeed762955783399e0b6f62fac1b563e8f9318"} Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.553845 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2x94t"] Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.554605 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2x94t" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="registry-server" containerID="cri-o://baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454" gracePeriod=2 Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.622079 4806 generic.go:334] "Generic (PLEG): container finished" podID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" 
containerID="bc2fdab67c0566709361a3deb33dd1bb1d69804d946510931eeb6911f2051bb4" exitCode=0 Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.622118 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" event={"ID":"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3","Type":"ContainerDied","Data":"bc2fdab67c0566709361a3deb33dd1bb1d69804d946510931eeb6911f2051bb4"} Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.952583 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"] Feb 17 15:36:02 crc kubenswrapper[4806]: I0217 15:36:02.952837 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zvrtb" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="registry-server" containerID="cri-o://68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a" gracePeriod=2 Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.551072 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.612503 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.634945 4806 generic.go:334] "Generic (PLEG): container finished" podID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerID="baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454" exitCode=0 Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.635006 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerDied","Data":"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454"} Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.635037 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2x94t" event={"ID":"4aa0bac5-5565-409b-8f08-18bf764c2a09","Type":"ContainerDied","Data":"6d073f4ced84db72c69636cfcf2bb166f370024e528b59c434092af5d2ec6ab0"} Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.635055 4806 scope.go:117] "RemoveContainer" containerID="baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.635183 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2x94t" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.642333 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wld\" (UniqueName: \"kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld\") pod \"83e7972c-0c91-4309-af07-16e729bb8c84\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.642483 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content\") pod \"83e7972c-0c91-4309-af07-16e729bb8c84\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.642542 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities\") pod \"83e7972c-0c91-4309-af07-16e729bb8c84\" (UID: \"83e7972c-0c91-4309-af07-16e729bb8c84\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.644525 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities" (OuterVolumeSpecName: "utilities") pod "83e7972c-0c91-4309-af07-16e729bb8c84" (UID: "83e7972c-0c91-4309-af07-16e729bb8c84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.648120 4806 generic.go:334] "Generic (PLEG): container finished" podID="83e7972c-0c91-4309-af07-16e729bb8c84" containerID="68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a" exitCode=0 Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.648528 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zvrtb" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.649314 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerDied","Data":"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a"} Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.649361 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zvrtb" event={"ID":"83e7972c-0c91-4309-af07-16e729bb8c84","Type":"ContainerDied","Data":"4d3784bd67ef46d94f1ff03e278aa692de42d5858e68be12993a3d7336206115"} Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.649687 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld" (OuterVolumeSpecName: "kube-api-access-j4wld") pod "83e7972c-0c91-4309-af07-16e729bb8c84" (UID: "83e7972c-0c91-4309-af07-16e729bb8c84"). InnerVolumeSpecName "kube-api-access-j4wld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.666889 4806 scope.go:117] "RemoveContainer" containerID="e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.709173 4806 scope.go:117] "RemoveContainer" containerID="ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.744499 4806 scope.go:117] "RemoveContainer" containerID="baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.747929 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities\") pod \"4aa0bac5-5565-409b-8f08-18bf764c2a09\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.748024 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxlh7\" (UniqueName: \"kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7\") pod \"4aa0bac5-5565-409b-8f08-18bf764c2a09\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.748046 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content\") pod \"4aa0bac5-5565-409b-8f08-18bf764c2a09\" (UID: \"4aa0bac5-5565-409b-8f08-18bf764c2a09\") " Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.748333 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wld\" (UniqueName: \"kubernetes.io/projected/83e7972c-0c91-4309-af07-16e729bb8c84-kube-api-access-j4wld\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 
15:36:03.748345 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.749321 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities" (OuterVolumeSpecName: "utilities") pod "4aa0bac5-5565-409b-8f08-18bf764c2a09" (UID: "4aa0bac5-5565-409b-8f08-18bf764c2a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.753637 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7" (OuterVolumeSpecName: "kube-api-access-cxlh7") pod "4aa0bac5-5565-409b-8f08-18bf764c2a09" (UID: "4aa0bac5-5565-409b-8f08-18bf764c2a09"). InnerVolumeSpecName "kube-api-access-cxlh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.759337 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454\": container with ID starting with baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454 not found: ID does not exist" containerID="baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.759389 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454"} err="failed to get container status \"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454\": rpc error: code = NotFound desc = could not find container \"baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454\": container with ID starting with baa3b1ccae720ce66cb054fbd255f35c18c7798a8548ba7db59c6c27d2751454 not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.759465 4806 scope.go:117] "RemoveContainer" containerID="e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.778534 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee\": container with ID starting with e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee not found: ID does not exist" containerID="e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.778581 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee"} 
err="failed to get container status \"e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee\": rpc error: code = NotFound desc = could not find container \"e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee\": container with ID starting with e6015e438f1b04db6b3fed57ad275061e507f737f6a464627f42c0087a7835ee not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.778610 4806 scope.go:117] "RemoveContainer" containerID="ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.782349 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac\": container with ID starting with ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac not found: ID does not exist" containerID="ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.782446 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac"} err="failed to get container status \"ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac\": rpc error: code = NotFound desc = could not find container \"ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac\": container with ID starting with ecb87511ad00e8c301eabf0435468e353456e3b3f44ff94ce0dfdf78395a07ac not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.782478 4806 scope.go:117] "RemoveContainer" containerID="68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.794656 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "83e7972c-0c91-4309-af07-16e729bb8c84" (UID: "83e7972c-0c91-4309-af07-16e729bb8c84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.814939 4806 scope.go:117] "RemoveContainer" containerID="5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.839090 4806 scope.go:117] "RemoveContainer" containerID="f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.843067 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa0bac5-5565-409b-8f08-18bf764c2a09" (UID: "4aa0bac5-5565-409b-8f08-18bf764c2a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.853344 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxlh7\" (UniqueName: \"kubernetes.io/projected/4aa0bac5-5565-409b-8f08-18bf764c2a09-kube-api-access-cxlh7\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.853377 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.853392 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83e7972c-0c91-4309-af07-16e729bb8c84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.853497 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4aa0bac5-5565-409b-8f08-18bf764c2a09-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.882351 4806 scope.go:117] "RemoveContainer" containerID="68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.882845 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a\": container with ID starting with 68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a not found: ID does not exist" containerID="68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.882874 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a"} err="failed to get container status \"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a\": rpc error: code = NotFound desc = could not find container \"68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a\": container with ID starting with 68cc61fc0ed001b51554d4e4e01c973d1ace3d3dcc25092f77341b21a497505a not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.882898 4806 scope.go:117] "RemoveContainer" containerID="5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.883158 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f\": container with ID starting with 5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f not found: ID does not exist" containerID="5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f" Feb 17 
15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.883182 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f"} err="failed to get container status \"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f\": rpc error: code = NotFound desc = could not find container \"5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f\": container with ID starting with 5363a9d2c19de8f8f71644becd2f011299ba9e45cda44db560fa69f58819c27f not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.883198 4806 scope.go:117] "RemoveContainer" containerID="f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289" Feb 17 15:36:03 crc kubenswrapper[4806]: E0217 15:36:03.883423 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289\": container with ID starting with f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289 not found: ID does not exist" containerID="f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.883445 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289"} err="failed to get container status \"f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289\": rpc error: code = NotFound desc = could not find container \"f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289\": container with ID starting with f7754a42fed0ae5d005d85abe39e6837d7d8624bbb5e19802cafaf0f7dea3289 not found: ID does not exist" Feb 17 15:36:03 crc kubenswrapper[4806]: I0217 15:36:03.941624 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.022463 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2x94t"] Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.028267 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2x94t"] Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.033473 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"] Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.037767 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zvrtb"] Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.058907 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle\") pod \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.059048 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjchp\" (UniqueName: \"kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp\") pod \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.059097 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util\") pod \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\" (UID: \"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3\") " Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.060138 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle" (OuterVolumeSpecName: "bundle") pod "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" (UID: "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.063898 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp" (OuterVolumeSpecName: "kube-api-access-bjchp") pod "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" (UID: "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3"). InnerVolumeSpecName "kube-api-access-bjchp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.092347 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util" (OuterVolumeSpecName: "util") pod "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" (UID: "0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.162144 4806 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.162188 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjchp\" (UniqueName: \"kubernetes.io/projected/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-kube-api-access-bjchp\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.162201 4806 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3-util\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.661947 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" event={"ID":"0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3","Type":"ContainerDied","Data":"ffc8dbe8552a1d7dee5ca19389d207f2c3527393fc9c3327ce73d6b7d7208100"} Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.662032 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc8dbe8552a1d7dee5ca19389d207f2c3527393fc9c3327ce73d6b7d7208100" Feb 17 15:36:04 crc kubenswrapper[4806]: I0217 15:36:04.661964 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4" Feb 17 15:36:05 crc kubenswrapper[4806]: I0217 15:36:05.177830 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" path="/var/lib/kubelet/pods/4aa0bac5-5565-409b-8f08-18bf764c2a09/volumes" Feb 17 15:36:05 crc kubenswrapper[4806]: I0217 15:36:05.179071 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" path="/var/lib/kubelet/pods/83e7972c-0c91-4309-af07-16e729bb8c84/volumes" Feb 17 15:36:06 crc kubenswrapper[4806]: I0217 15:36:06.479004 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:36:06 crc kubenswrapper[4806]: I0217 15:36:06.480821 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" Feb 17 15:36:07 crc kubenswrapper[4806]: I0217 15:36:07.753723 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:36:07 crc kubenswrapper[4806]: I0217 15:36:07.755659 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tsxl6" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="registry-server" containerID="cri-o://f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558" gracePeriod=2 Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.232486 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.327487 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content\") pod \"5b74ca1f-39bc-46b3-858f-311bee5fc691\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.327698 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities\") pod \"5b74ca1f-39bc-46b3-858f-311bee5fc691\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.327732 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j2nt\" (UniqueName: \"kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt\") pod \"5b74ca1f-39bc-46b3-858f-311bee5fc691\" (UID: \"5b74ca1f-39bc-46b3-858f-311bee5fc691\") " Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.330164 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities" (OuterVolumeSpecName: "utilities") pod "5b74ca1f-39bc-46b3-858f-311bee5fc691" (UID: "5b74ca1f-39bc-46b3-858f-311bee5fc691"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.336511 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt" (OuterVolumeSpecName: "kube-api-access-2j2nt") pod "5b74ca1f-39bc-46b3-858f-311bee5fc691" (UID: "5b74ca1f-39bc-46b3-858f-311bee5fc691"). InnerVolumeSpecName "kube-api-access-2j2nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.391393 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b74ca1f-39bc-46b3-858f-311bee5fc691" (UID: "5b74ca1f-39bc-46b3-858f-311bee5fc691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.429644 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.429929 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b74ca1f-39bc-46b3-858f-311bee5fc691-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.430031 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j2nt\" (UniqueName: \"kubernetes.io/projected/5b74ca1f-39bc-46b3-858f-311bee5fc691-kube-api-access-2j2nt\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.695890 4806 generic.go:334] "Generic (PLEG): container finished" podID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerID="f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558" exitCode=0 Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.695968 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerDied","Data":"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558"} Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.696010 4806 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-tsxl6" event={"ID":"5b74ca1f-39bc-46b3-858f-311bee5fc691","Type":"ContainerDied","Data":"8fb2d86fa32038d0400cca0cbba45e3e0f1d944c0f6f8b810a84b2ef3f355fcf"} Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.696034 4806 scope.go:117] "RemoveContainer" containerID="f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.696306 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tsxl6" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.721600 4806 scope.go:117] "RemoveContainer" containerID="7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.740455 4806 scope.go:117] "RemoveContainer" containerID="b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.748904 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.756768 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tsxl6"] Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.775558 4806 scope.go:117] "RemoveContainer" containerID="f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558" Feb 17 15:36:08 crc kubenswrapper[4806]: E0217 15:36:08.776335 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558\": container with ID starting with f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558 not found: ID does not exist" containerID="f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 
15:36:08.776386 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558"} err="failed to get container status \"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558\": rpc error: code = NotFound desc = could not find container \"f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558\": container with ID starting with f3611b0329dba851c6e9a71df641c31a973df7e5fe807666bdb6bd66ae162558 not found: ID does not exist" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.776433 4806 scope.go:117] "RemoveContainer" containerID="7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b" Feb 17 15:36:08 crc kubenswrapper[4806]: E0217 15:36:08.776943 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b\": container with ID starting with 7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b not found: ID does not exist" containerID="7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.776981 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b"} err="failed to get container status \"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b\": rpc error: code = NotFound desc = could not find container \"7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b\": container with ID starting with 7cd6c900b3cd09e176f7ed26f3c6e6c05079457f8c4a4383c72fb75b7a00063b not found: ID does not exist" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.777081 4806 scope.go:117] "RemoveContainer" containerID="b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a" Feb 17 15:36:08 crc 
kubenswrapper[4806]: E0217 15:36:08.777499 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a\": container with ID starting with b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a not found: ID does not exist" containerID="b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a" Feb 17 15:36:08 crc kubenswrapper[4806]: I0217 15:36:08.777533 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a"} err="failed to get container status \"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a\": rpc error: code = NotFound desc = could not find container \"b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a\": container with ID starting with b5964a263a809748f4e3d0946ef34b7e86d9ac628f7d3ea9a66526632bd0892a not found: ID does not exist" Feb 17 15:36:09 crc kubenswrapper[4806]: I0217 15:36:09.243124 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" path="/var/lib/kubelet/pods/5b74ca1f-39bc-46b3-858f-311bee5fc691/volumes" Feb 17 15:36:11 crc kubenswrapper[4806]: I0217 15:36:11.472845 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " pod="glance-kuttl-tests/swift-storage-0" Feb 17 15:36:11 crc kubenswrapper[4806]: I0217 15:36:11.484234 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/83699dfd-16c6-425d-b761-26b3635984ae-etc-swift\") pod \"swift-storage-0\" (UID: \"83699dfd-16c6-425d-b761-26b3635984ae\") " 
pod="glance-kuttl-tests/swift-storage-0" Feb 17 15:36:11 crc kubenswrapper[4806]: I0217 15:36:11.718831 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Feb 17 15:36:12 crc kubenswrapper[4806]: I0217 15:36:12.183925 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Feb 17 15:36:12 crc kubenswrapper[4806]: W0217 15:36:12.190116 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83699dfd_16c6_425d_b761_26b3635984ae.slice/crio-7863ab3b2eb537f63c6edf2d59c5aef468a88989eafe7439153d57f621c84b1a WatchSource:0}: Error finding container 7863ab3b2eb537f63c6edf2d59c5aef468a88989eafe7439153d57f621c84b1a: Status 404 returned error can't find the container with id 7863ab3b2eb537f63c6edf2d59c5aef468a88989eafe7439153d57f621c84b1a Feb 17 15:36:12 crc kubenswrapper[4806]: I0217 15:36:12.734693 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"7863ab3b2eb537f63c6edf2d59c5aef468a88989eafe7439153d57f621c84b1a"} Feb 17 15:36:14 crc kubenswrapper[4806]: I0217 15:36:14.753113 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"5ca2628abaa563731d2d11b5f684f7f61ad133f9ee5e1bd29ee2ca90d5a1cc99"} Feb 17 15:36:14 crc kubenswrapper[4806]: I0217 15:36:14.753855 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"36025a583aab4c0b852fd57f70b50c71042e5d7f452ad91dbc2f4582cd59beec"} Feb 17 15:36:14 crc kubenswrapper[4806]: I0217 15:36:14.753870 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"ad37cb54a64c04e48cf96bf69bfed3bb982bdf71ba782963b86e5a394d7f890c"} Feb 17 15:36:14 crc kubenswrapper[4806]: I0217 15:36:14.753882 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"1e1e1e428177047bc0633a763505fc4bdfb7f67572a6d3b2706e8c610b697de5"} Feb 17 15:36:15 crc kubenswrapper[4806]: I0217 15:36:15.764879 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"49d71d838761db81832f60ac04e648dd8ae33c277cd6e4122e9fb8ba40a1981d"} Feb 17 15:36:15 crc kubenswrapper[4806]: I0217 15:36:15.765255 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"7ab88d3a71a7a889c2aa669a4811d974e2aae496929e160384502f6aa860bcc3"} Feb 17 15:36:16 crc kubenswrapper[4806]: I0217 15:36:16.791394 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"47cf0f3af99565ca8b64498def2ab120210506c640f341f2ffd54ecfce1048b2"} Feb 17 15:36:16 crc kubenswrapper[4806]: I0217 15:36:16.791500 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"14ea3bd1ebd8fdb6eec4962361202a57fb610b2a1c11f708a031677aefd71572"} Feb 17 15:36:17 crc kubenswrapper[4806]: I0217 15:36:17.802886 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"480d0c2f3151ac47ddc809a9b82e8b0a15be2cdf1bfac043713505d4a2e44050"} Feb 17 15:36:17 crc kubenswrapper[4806]: I0217 15:36:17.803341 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"4dbb10c7152326374792ba9fd0d0fa7741ea15c669d43b738723b0ed59f2d78e"} Feb 17 15:36:17 crc kubenswrapper[4806]: I0217 15:36:17.803356 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"167a803974384b55ef0389ff7a4ff580fdaf5e16562641fb7292d1498379b497"} Feb 17 15:36:18 crc kubenswrapper[4806]: I0217 15:36:18.818814 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"fdcb4ddc0ecc1b6c4094f667d02f5dcd131f97ddcd09299af866e544a8dfbdab"} Feb 17 15:36:18 crc kubenswrapper[4806]: I0217 15:36:18.818893 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"3c9308fdd13cd581e761a94ccb63d0f66ad4a1bb028da0d2e7b099c4bea0bdec"} Feb 17 15:36:18 crc kubenswrapper[4806]: I0217 15:36:18.818914 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"b19f8a16c1454af41365e409cb07490758ac59be3ecb838c39b58a9363784ff9"} Feb 17 15:36:18 crc kubenswrapper[4806]: I0217 15:36:18.818933 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"83699dfd-16c6-425d-b761-26b3635984ae","Type":"ContainerStarted","Data":"a5186add15f476d41c93df7b9b01a597a3c6f9ff7d5c4f18b58ccfb055bdb513"}
Feb 17 15:36:18 crc kubenswrapper[4806]: I0217 15:36:18.849977 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=35.87459862 podStartE2EDuration="40.84995723s" podCreationTimestamp="2026-02-17 15:35:38 +0000 UTC" firstStartedPulling="2026-02-17 15:36:12.193278386 +0000 UTC m=+933.723908807" lastFinishedPulling="2026-02-17 15:36:17.168637006 +0000 UTC m=+938.699267417" observedRunningTime="2026-02-17 15:36:18.847974511 +0000 UTC m=+940.378604942" watchObservedRunningTime="2026-02-17 15:36:18.84995723 +0000 UTC m=+940.380587651"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.774439 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"]
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775599 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775616 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775628 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775635 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775645 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775654 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775665 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775672 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775683 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="pull"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775691 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="pull"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775700 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775708 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775718 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775725 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775738 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775745 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775761 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="extract"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775769 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="extract"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775779 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775785 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="extract-utilities"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775797 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="util"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775804 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="util"
Feb 17 15:36:23 crc kubenswrapper[4806]: E0217 15:36:23.775817 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775825 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="extract-content"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.775986 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b74ca1f-39bc-46b3-858f-311bee5fc691" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.776004 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa0bac5-5565-409b-8f08-18bf764c2a09" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.776014 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e7972c-0c91-4309-af07-16e729bb8c84" containerName="registry-server"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.776029 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3" containerName="extract"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.776479 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.778775 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-bxkt6"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.779142 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.791537 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"]
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.876101 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbms\" (UniqueName: \"kubernetes.io/projected/af129dac-8ce3-4199-85d4-e07ad5adf02b-kube-api-access-kfbms\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.876612 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-webhook-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.876646 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-apiservice-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.978373 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-apiservice-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.978537 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbms\" (UniqueName: \"kubernetes.io/projected/af129dac-8ce3-4199-85d4-e07ad5adf02b-kube-api-access-kfbms\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.978582 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-webhook-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.988035 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-webhook-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.992166 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af129dac-8ce3-4199-85d4-e07ad5adf02b-apiservice-cert\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:23 crc kubenswrapper[4806]: I0217 15:36:23.996011 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbms\" (UniqueName: \"kubernetes.io/projected/af129dac-8ce3-4199-85d4-e07ad5adf02b-kube-api-access-kfbms\") pod \"glance-operator-controller-manager-84d4cfd9dd-bwz79\" (UID: \"af129dac-8ce3-4199-85d4-e07ad5adf02b\") " pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:24 crc kubenswrapper[4806]: I0217 15:36:24.098849 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:24 crc kubenswrapper[4806]: I0217 15:36:24.536626 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"]
Feb 17 15:36:24 crc kubenswrapper[4806]: I0217 15:36:24.874782 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79" event={"ID":"af129dac-8ce3-4199-85d4-e07ad5adf02b","Type":"ContainerStarted","Data":"346314097d917e67c4c93b87f99d57f85efbf82e13c96498897ce0b3c2b106bc"}
Feb 17 15:36:26 crc kubenswrapper[4806]: I0217 15:36:26.891309 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79" event={"ID":"af129dac-8ce3-4199-85d4-e07ad5adf02b","Type":"ContainerStarted","Data":"09ca258b9d21022598d85c364f05f07e9410376b0db8aa780af17bf0b09fe747"}
Feb 17 15:36:26 crc kubenswrapper[4806]: I0217 15:36:26.892206 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:26 crc kubenswrapper[4806]: I0217 15:36:26.915919 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79" podStartSLOduration=2.628251793 podStartE2EDuration="3.915897329s" podCreationTimestamp="2026-02-17 15:36:23 +0000 UTC" firstStartedPulling="2026-02-17 15:36:24.553060157 +0000 UTC m=+946.083690568" lastFinishedPulling="2026-02-17 15:36:25.840705693 +0000 UTC m=+947.371336104" observedRunningTime="2026-02-17 15:36:26.909071451 +0000 UTC m=+948.439701872" watchObservedRunningTime="2026-02-17 15:36:26.915897329 +0000 UTC m=+948.446527730"
Feb 17 15:36:34 crc kubenswrapper[4806]: I0217 15:36:34.105893 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84d4cfd9dd-bwz79"
Feb 17 15:36:35 crc kubenswrapper[4806]: I0217 15:36:35.961395 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"]
Feb 17 15:36:35 crc kubenswrapper[4806]: I0217 15:36:35.963371 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:35 crc kubenswrapper[4806]: I0217 15:36:35.977461 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"]
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.061595 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn5z8\" (UniqueName: \"kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.062081 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.062117 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.163532 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn5z8\" (UniqueName: \"kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.163618 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.163669 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.164214 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.164313 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.184971 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn5z8\" (UniqueName: \"kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8\") pod \"redhat-marketplace-fqqr8\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.284695 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqqr8"
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.741686 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"]
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.981125 4806 generic.go:334] "Generic (PLEG): container finished" podID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerID="bfc57ea49c21927a0bbde97ee158672303228bc513815d44643bc835367b8449" exitCode=0
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.981177 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerDied","Data":"bfc57ea49c21927a0bbde97ee158672303228bc513815d44643bc835367b8449"}
Feb 17 15:36:36 crc kubenswrapper[4806]: I0217 15:36:36.981209 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerStarted","Data":"1efebb42c4b75277545bb2507f205518e2046fbcbed2cb01500488846c5b63da"}
Feb 17 15:36:37 crc kubenswrapper[4806]: I0217 15:36:37.990202 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerStarted","Data":"fb4c5b2e45b77df3510efb0c9970d1ce3117b9cd3e3a6a5d6eb5a7c8ea6a5023"}
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.001366 4806 generic.go:334] "Generic (PLEG): container finished" podID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerID="fb4c5b2e45b77df3510efb0c9970d1ce3117b9cd3e3a6a5d6eb5a7c8ea6a5023" exitCode=0
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.001434 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerDied","Data":"fb4c5b2e45b77df3510efb0c9970d1ce3117b9cd3e3a6a5d6eb5a7c8ea6a5023"}
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.577449 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.578692 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.580950 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-dfdfw"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.582458 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.582641 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.591007 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.597725 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.648181 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-qbtnx"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.649036 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.656604 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-f248-account-create-update-7lszd"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.657435 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.658851 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.663760 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f248-account-create-update-7lszd"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.669231 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-qbtnx"]
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.711896 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.711946 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvmp\" (UniqueName: \"kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.711968 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.711992 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813450 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc8xx\" (UniqueName: \"kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813508 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwn9q\" (UniqueName: \"kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813550 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813725 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813804 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813891 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvmp\" (UniqueName: \"kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.813943 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.814017 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.815099 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.815491 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.824003 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.834258 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvmp\" (UniqueName: \"kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp\") pod \"openstackclient\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.896071 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.914872 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwn9q\" (UniqueName: \"kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.914956 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.915008 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.915081 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc8xx\" (UniqueName: \"kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.915785 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.916084 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.936192 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc8xx\" (UniqueName: \"kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx\") pod \"glance-f248-account-create-update-7lszd\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.936974 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwn9q\" (UniqueName: \"kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q\") pod \"glance-db-create-qbtnx\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") " pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.967708 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:39 crc kubenswrapper[4806]: I0217 15:36:39.977575 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd"
Feb 17 15:36:40 crc kubenswrapper[4806]: I0217 15:36:40.023274 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerStarted","Data":"5f8f4d1832eec50e75deb133e54581497c666cda43dd81cf991935e24d0821df"}
Feb 17 15:36:40 crc kubenswrapper[4806]: I0217 15:36:40.046142 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fqqr8" podStartSLOduration=2.56810951 podStartE2EDuration="5.046121001s" podCreationTimestamp="2026-02-17 15:36:35 +0000 UTC" firstStartedPulling="2026-02-17 15:36:36.982831666 +0000 UTC m=+958.513462087" lastFinishedPulling="2026-02-17 15:36:39.460843167 +0000 UTC m=+960.991473578" observedRunningTime="2026-02-17 15:36:40.045347652 +0000 UTC m=+961.575978083" watchObservedRunningTime="2026-02-17 15:36:40.046121001 +0000 UTC m=+961.576751422"
Feb 17 15:36:40 crc kubenswrapper[4806]: I0217 15:36:40.166942 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Feb 17 15:36:40 crc kubenswrapper[4806]: W0217 15:36:40.188784 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc706f003_5196_40be_8d86_015f57f65c3f.slice/crio-682a9037a7c6f81f52f1d47645b8c8157c9b4c848035cfaa23a11160f836af7b WatchSource:0}: Error finding container 682a9037a7c6f81f52f1d47645b8c8157c9b4c848035cfaa23a11160f836af7b: Status 404 returned error can't find the container with id 682a9037a7c6f81f52f1d47645b8c8157c9b4c848035cfaa23a11160f836af7b
Feb 17 15:36:40 crc kubenswrapper[4806]: I0217 15:36:40.477873 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-qbtnx"]
Feb 17 15:36:40 crc kubenswrapper[4806]: I0217 15:36:40.553517 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f248-account-create-update-7lszd"]
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.034769 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"c706f003-5196-40be-8d86-015f57f65c3f","Type":"ContainerStarted","Data":"682a9037a7c6f81f52f1d47645b8c8157c9b4c848035cfaa23a11160f836af7b"}
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.038582 4806 generic.go:334] "Generic (PLEG): container finished" podID="84493b4d-1972-4ba3-b1ca-e48023412343" containerID="9424863c48cd7cc36b24b3e125720d884aadfc81b5b4104dd24669e943dfd8e9" exitCode=0
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.038666 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qbtnx" event={"ID":"84493b4d-1972-4ba3-b1ca-e48023412343","Type":"ContainerDied","Data":"9424863c48cd7cc36b24b3e125720d884aadfc81b5b4104dd24669e943dfd8e9"}
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.038701 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qbtnx" event={"ID":"84493b4d-1972-4ba3-b1ca-e48023412343","Type":"ContainerStarted","Data":"612d88ef6663d6e63c6fd09ce7ee8ce581a9f6fb0bafbe8557694f163f2b4aa4"}
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.040543 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" event={"ID":"fd9135ba-f2fb-4bd2-94d8-5c60d6797991","Type":"ContainerStarted","Data":"d54afb0a88d904663049b0088bbcac9155d64331fd532ed1658eb4d6a1f6c4ee"}
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.040587 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" event={"ID":"fd9135ba-f2fb-4bd2-94d8-5c60d6797991","Type":"ContainerStarted","Data":"f72b004cdac98fbc158cdefc07cba40639b56594a89a4acf1b97c8dfacaa17c6"}
Feb 17 15:36:41 crc kubenswrapper[4806]: I0217 15:36:41.069736 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" podStartSLOduration=2.069717285 podStartE2EDuration="2.069717285s" podCreationTimestamp="2026-02-17 15:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:36:41.064877846 +0000 UTC m=+962.595508267" watchObservedRunningTime="2026-02-17 15:36:41.069717285 +0000 UTC m=+962.600347696"
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.058588 4806 generic.go:334] "Generic (PLEG): container finished" podID="fd9135ba-f2fb-4bd2-94d8-5c60d6797991" containerID="d54afb0a88d904663049b0088bbcac9155d64331fd532ed1658eb4d6a1f6c4ee" exitCode=0
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.058654 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" event={"ID":"fd9135ba-f2fb-4bd2-94d8-5c60d6797991","Type":"ContainerDied","Data":"d54afb0a88d904663049b0088bbcac9155d64331fd532ed1658eb4d6a1f6c4ee"}
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.376607 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qbtnx"
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.552631 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts\") pod \"84493b4d-1972-4ba3-b1ca-e48023412343\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") "
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.552793 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwn9q\" (UniqueName: \"kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q\") pod \"84493b4d-1972-4ba3-b1ca-e48023412343\" (UID: \"84493b4d-1972-4ba3-b1ca-e48023412343\") "
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.553494 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84493b4d-1972-4ba3-b1ca-e48023412343" (UID: "84493b4d-1972-4ba3-b1ca-e48023412343"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.559574 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q" (OuterVolumeSpecName: "kube-api-access-lwn9q") pod "84493b4d-1972-4ba3-b1ca-e48023412343" (UID: "84493b4d-1972-4ba3-b1ca-e48023412343"). InnerVolumeSpecName "kube-api-access-lwn9q".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.654178 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84493b4d-1972-4ba3-b1ca-e48023412343-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:42 crc kubenswrapper[4806]: I0217 15:36:42.654214 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwn9q\" (UniqueName: \"kubernetes.io/projected/84493b4d-1972-4ba3-b1ca-e48023412343-kube-api-access-lwn9q\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.071035 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qbtnx" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.071129 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qbtnx" event={"ID":"84493b4d-1972-4ba3-b1ca-e48023412343","Type":"ContainerDied","Data":"612d88ef6663d6e63c6fd09ce7ee8ce581a9f6fb0bafbe8557694f163f2b4aa4"} Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.071177 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="612d88ef6663d6e63c6fd09ce7ee8ce581a9f6fb0bafbe8557694f163f2b4aa4" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.396541 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.567806 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts\") pod \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.569809 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc8xx\" (UniqueName: \"kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx\") pod \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\" (UID: \"fd9135ba-f2fb-4bd2-94d8-5c60d6797991\") " Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.569675 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd9135ba-f2fb-4bd2-94d8-5c60d6797991" (UID: "fd9135ba-f2fb-4bd2-94d8-5c60d6797991"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.575466 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx" (OuterVolumeSpecName: "kube-api-access-sc8xx") pod "fd9135ba-f2fb-4bd2-94d8-5c60d6797991" (UID: "fd9135ba-f2fb-4bd2-94d8-5c60d6797991"). InnerVolumeSpecName "kube-api-access-sc8xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.672099 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:43 crc kubenswrapper[4806]: I0217 15:36:43.672136 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc8xx\" (UniqueName: \"kubernetes.io/projected/fd9135ba-f2fb-4bd2-94d8-5c60d6797991-kube-api-access-sc8xx\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.080803 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" event={"ID":"fd9135ba-f2fb-4bd2-94d8-5c60d6797991","Type":"ContainerDied","Data":"f72b004cdac98fbc158cdefc07cba40639b56594a89a4acf1b97c8dfacaa17c6"} Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.080857 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f72b004cdac98fbc158cdefc07cba40639b56594a89a4acf1b97c8dfacaa17c6" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.080926 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-f248-account-create-update-7lszd" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.857509 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-s4pmr"] Feb 17 15:36:44 crc kubenswrapper[4806]: E0217 15:36:44.858196 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9135ba-f2fb-4bd2-94d8-5c60d6797991" containerName="mariadb-account-create-update" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.858208 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9135ba-f2fb-4bd2-94d8-5c60d6797991" containerName="mariadb-account-create-update" Feb 17 15:36:44 crc kubenswrapper[4806]: E0217 15:36:44.858219 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84493b4d-1972-4ba3-b1ca-e48023412343" containerName="mariadb-database-create" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.858225 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="84493b4d-1972-4ba3-b1ca-e48023412343" containerName="mariadb-database-create" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.858341 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9135ba-f2fb-4bd2-94d8-5c60d6797991" containerName="mariadb-account-create-update" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.858352 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="84493b4d-1972-4ba3-b1ca-e48023412343" containerName="mariadb-database-create" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.858923 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.860535 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.861200 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-q4wvc" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.870516 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s4pmr"] Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.993219 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.993289 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cp8x\" (UniqueName: \"kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:44 crc kubenswrapper[4806]: I0217 15:36:44.993337 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.094685 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.094734 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cp8x\" (UniqueName: \"kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.094767 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.101241 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.102145 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data\") pod \"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.130882 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cp8x\" (UniqueName: \"kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x\") pod 
\"glance-db-sync-s4pmr\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:45 crc kubenswrapper[4806]: I0217 15:36:45.224538 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:36:46 crc kubenswrapper[4806]: I0217 15:36:46.285291 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:46 crc kubenswrapper[4806]: I0217 15:36:46.285769 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:46 crc kubenswrapper[4806]: I0217 15:36:46.332623 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:47 crc kubenswrapper[4806]: I0217 15:36:47.177098 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:48 crc kubenswrapper[4806]: I0217 15:36:48.902159 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s4pmr"] Feb 17 15:36:48 crc kubenswrapper[4806]: W0217 15:36:48.906483 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod590c794e_303e_4a1b_bb48_846372bf5d53.slice/crio-0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1 WatchSource:0}: Error finding container 0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1: Status 404 returned error can't find the container with id 0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1 Feb 17 15:36:49 crc kubenswrapper[4806]: I0217 15:36:49.116580 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s4pmr" 
event={"ID":"590c794e-303e-4a1b-bb48-846372bf5d53","Type":"ContainerStarted","Data":"0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1"} Feb 17 15:36:49 crc kubenswrapper[4806]: I0217 15:36:49.117853 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"c706f003-5196-40be-8d86-015f57f65c3f","Type":"ContainerStarted","Data":"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324"} Feb 17 15:36:49 crc kubenswrapper[4806]: I0217 15:36:49.141758 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.826495257 podStartE2EDuration="10.141724854s" podCreationTimestamp="2026-02-17 15:36:39 +0000 UTC" firstStartedPulling="2026-02-17 15:36:40.191275681 +0000 UTC m=+961.721906092" lastFinishedPulling="2026-02-17 15:36:48.506505278 +0000 UTC m=+970.037135689" observedRunningTime="2026-02-17 15:36:49.134507876 +0000 UTC m=+970.665138287" watchObservedRunningTime="2026-02-17 15:36:49.141724854 +0000 UTC m=+970.672355295" Feb 17 15:36:49 crc kubenswrapper[4806]: I0217 15:36:49.957341 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"] Feb 17 15:36:49 crc kubenswrapper[4806]: I0217 15:36:49.957778 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fqqr8" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="registry-server" containerID="cri-o://5f8f4d1832eec50e75deb133e54581497c666cda43dd81cf991935e24d0821df" gracePeriod=2 Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.131822 4806 generic.go:334] "Generic (PLEG): container finished" podID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerID="5f8f4d1832eec50e75deb133e54581497c666cda43dd81cf991935e24d0821df" exitCode=0 Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.132819 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fqqr8" event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerDied","Data":"5f8f4d1832eec50e75deb133e54581497c666cda43dd81cf991935e24d0821df"} Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.397766 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.572957 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities\") pod \"3ad06650-7efc-4845-af41-e9b84edef7c5\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.573553 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn5z8\" (UniqueName: \"kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8\") pod \"3ad06650-7efc-4845-af41-e9b84edef7c5\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.573662 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content\") pod \"3ad06650-7efc-4845-af41-e9b84edef7c5\" (UID: \"3ad06650-7efc-4845-af41-e9b84edef7c5\") " Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.577266 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities" (OuterVolumeSpecName: "utilities") pod "3ad06650-7efc-4845-af41-e9b84edef7c5" (UID: "3ad06650-7efc-4845-af41-e9b84edef7c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.583265 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8" (OuterVolumeSpecName: "kube-api-access-fn5z8") pod "3ad06650-7efc-4845-af41-e9b84edef7c5" (UID: "3ad06650-7efc-4845-af41-e9b84edef7c5"). InnerVolumeSpecName "kube-api-access-fn5z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.597347 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ad06650-7efc-4845-af41-e9b84edef7c5" (UID: "3ad06650-7efc-4845-af41-e9b84edef7c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.674906 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.674941 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn5z8\" (UniqueName: \"kubernetes.io/projected/3ad06650-7efc-4845-af41-e9b84edef7c5-kube-api-access-fn5z8\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:50 crc kubenswrapper[4806]: I0217 15:36:50.674952 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad06650-7efc-4845-af41-e9b84edef7c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.143192 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqqr8" 
event={"ID":"3ad06650-7efc-4845-af41-e9b84edef7c5","Type":"ContainerDied","Data":"1efebb42c4b75277545bb2507f205518e2046fbcbed2cb01500488846c5b63da"} Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.143240 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqqr8" Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.143384 4806 scope.go:117] "RemoveContainer" containerID="5f8f4d1832eec50e75deb133e54581497c666cda43dd81cf991935e24d0821df" Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.175178 4806 scope.go:117] "RemoveContainer" containerID="fb4c5b2e45b77df3510efb0c9970d1ce3117b9cd3e3a6a5d6eb5a7c8ea6a5023" Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.182879 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"] Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.188807 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqqr8"] Feb 17 15:36:51 crc kubenswrapper[4806]: I0217 15:36:51.222493 4806 scope.go:117] "RemoveContainer" containerID="bfc57ea49c21927a0bbde97ee158672303228bc513815d44643bc835367b8449" Feb 17 15:36:53 crc kubenswrapper[4806]: I0217 15:36:53.168154 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" path="/var/lib/kubelet/pods/3ad06650-7efc-4845-af41-e9b84edef7c5/volumes" Feb 17 15:37:01 crc kubenswrapper[4806]: I0217 15:37:01.232160 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s4pmr" event={"ID":"590c794e-303e-4a1b-bb48-846372bf5d53","Type":"ContainerStarted","Data":"cd549971250af6344d561c17c9e7ae17b2a2b64bf0366f555b6c8ff4905e205e"} Feb 17 15:37:01 crc kubenswrapper[4806]: I0217 15:37:01.254078 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-s4pmr" 
podStartSLOduration=6.446968391 podStartE2EDuration="17.254059098s" podCreationTimestamp="2026-02-17 15:36:44 +0000 UTC" firstStartedPulling="2026-02-17 15:36:48.909393664 +0000 UTC m=+970.440024075" lastFinishedPulling="2026-02-17 15:36:59.716484371 +0000 UTC m=+981.247114782" observedRunningTime="2026-02-17 15:37:01.252785516 +0000 UTC m=+982.783415967" watchObservedRunningTime="2026-02-17 15:37:01.254059098 +0000 UTC m=+982.784689529" Feb 17 15:37:04 crc kubenswrapper[4806]: I0217 15:37:04.784684 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:37:04 crc kubenswrapper[4806]: I0217 15:37:04.785593 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:37:07 crc kubenswrapper[4806]: I0217 15:37:07.533683 4806 generic.go:334] "Generic (PLEG): container finished" podID="590c794e-303e-4a1b-bb48-846372bf5d53" containerID="cd549971250af6344d561c17c9e7ae17b2a2b64bf0366f555b6c8ff4905e205e" exitCode=0 Feb 17 15:37:07 crc kubenswrapper[4806]: I0217 15:37:07.533781 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s4pmr" event={"ID":"590c794e-303e-4a1b-bb48-846372bf5d53","Type":"ContainerDied","Data":"cd549971250af6344d561c17c9e7ae17b2a2b64bf0366f555b6c8ff4905e205e"} Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.875589 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.973956 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data\") pod \"590c794e-303e-4a1b-bb48-846372bf5d53\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.974036 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cp8x\" (UniqueName: \"kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x\") pod \"590c794e-303e-4a1b-bb48-846372bf5d53\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.974154 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data\") pod \"590c794e-303e-4a1b-bb48-846372bf5d53\" (UID: \"590c794e-303e-4a1b-bb48-846372bf5d53\") " Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.980697 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x" (OuterVolumeSpecName: "kube-api-access-9cp8x") pod "590c794e-303e-4a1b-bb48-846372bf5d53" (UID: "590c794e-303e-4a1b-bb48-846372bf5d53"). InnerVolumeSpecName "kube-api-access-9cp8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:37:08 crc kubenswrapper[4806]: I0217 15:37:08.981455 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "590c794e-303e-4a1b-bb48-846372bf5d53" (UID: "590c794e-303e-4a1b-bb48-846372bf5d53"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.010186 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data" (OuterVolumeSpecName: "config-data") pod "590c794e-303e-4a1b-bb48-846372bf5d53" (UID: "590c794e-303e-4a1b-bb48-846372bf5d53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.075825 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.075861 4806 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/590c794e-303e-4a1b-bb48-846372bf5d53-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.075873 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cp8x\" (UniqueName: \"kubernetes.io/projected/590c794e-303e-4a1b-bb48-846372bf5d53-kube-api-access-9cp8x\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.550916 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-s4pmr" event={"ID":"590c794e-303e-4a1b-bb48-846372bf5d53","Type":"ContainerDied","Data":"0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1"} Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.550961 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-s4pmr" Feb 17 15:37:09 crc kubenswrapper[4806]: I0217 15:37:09.550977 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b98196fab561776375e24f6cb01a983a9e307c4eaea5fbedc88f8e36ff107e1" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.861837 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:37:10 crc kubenswrapper[4806]: E0217 15:37:10.862103 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="extract-content" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862117 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="extract-content" Feb 17 15:37:10 crc kubenswrapper[4806]: E0217 15:37:10.862131 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="registry-server" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862137 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="registry-server" Feb 17 15:37:10 crc kubenswrapper[4806]: E0217 15:37:10.862149 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590c794e-303e-4a1b-bb48-846372bf5d53" containerName="glance-db-sync" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862155 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="590c794e-303e-4a1b-bb48-846372bf5d53" containerName="glance-db-sync" Feb 17 15:37:10 crc kubenswrapper[4806]: E0217 15:37:10.862167 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="extract-utilities" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862173 4806 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="extract-utilities" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862287 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad06650-7efc-4845-af41-e9b84edef7c5" containerName="registry-server" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862299 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="590c794e-303e-4a1b-bb48-846372bf5d53" containerName="glance-db-sync" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.862952 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.865427 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.866233 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-q4wvc" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.870740 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.878247 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.936635 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.972058 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:10 crc kubenswrapper[4806]: I0217 15:37:10.976613 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.003572 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.003936 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004059 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004215 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004322 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004449 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004606 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004725 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004827 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.004984 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskx6\" (UniqueName: 
\"kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.005100 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.005252 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.005361 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.005517 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107476 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107537 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskx6\" (UniqueName: \"kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107559 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107582 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107614 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107633 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107656 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxr5\" (UniqueName: \"kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107682 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107704 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107728 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107754 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run\") pod 
\"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107773 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107795 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107822 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107837 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107857 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys\") pod \"glance-default-single-1\" (UID: 
\"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107874 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107899 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107925 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107945 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.107966 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc 
kubenswrapper[4806]: I0217 15:37:11.107985 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108005 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108026 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108044 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108070 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108090 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108108 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108211 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108279 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108502 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108794 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.108840 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109303 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109352 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109550 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109617 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109671 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.109977 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.120892 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.123051 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.137989 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskx6\" (UniqueName: \"kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 
15:37:11.140653 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.142214 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.182031 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209298 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209363 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209384 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc 
kubenswrapper[4806]: I0217 15:37:11.209432 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxr5\" (UniqueName: \"kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209470 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209504 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209507 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209511 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209539 4806 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209599 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209619 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209670 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209728 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.209934 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run\") pod 
\"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210350 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210431 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210469 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210496 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210521 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc 
kubenswrapper[4806]: I0217 15:37:11.210545 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210608 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210632 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210667 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210486 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.210714 4806 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.214462 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.223471 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.227138 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxr5\" (UniqueName: \"kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.240660 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.242543 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: 
\"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.286605 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.472007 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.570540 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerStarted","Data":"fde3e5ff8c519fcf2663b943dab22b77d2f8913ffca201213e6f916a2693c4cb"} Feb 17 15:37:11 crc kubenswrapper[4806]: I0217 15:37:11.765525 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:11 crc kubenswrapper[4806]: W0217 15:37:11.767555 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4c2e054_1dab_449c_9de6_9709e4f6d001.slice/crio-1e2ebb142922a29f80042505175b299f3519c1101b04af4c0be4b8b8420c78c1 WatchSource:0}: Error finding container 1e2ebb142922a29f80042505175b299f3519c1101b04af4c0be4b8b8420c78c1: Status 404 returned error can't find the container with id 1e2ebb142922a29f80042505175b299f3519c1101b04af4c0be4b8b8420c78c1 Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.578793 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerStarted","Data":"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637"} Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.579440 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerStarted","Data":"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7"} Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.581889 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerStarted","Data":"6ac3ff45a749230df5201c9c49bdd0ac711fc3bb6744252418165377093eb6b5"} Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.581934 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerStarted","Data":"7edb790f6c5ad591181fb2a53f5125c129e7e8279796108b2b0fc5f339f66354"} Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.581949 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerStarted","Data":"1e2ebb142922a29f80042505175b299f3519c1101b04af4c0be4b8b8420c78c1"} Feb 17 15:37:12 crc kubenswrapper[4806]: I0217 15:37:12.611597 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.611570113 podStartE2EDuration="2.611570113s" podCreationTimestamp="2026-02-17 15:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:37:12.606523508 +0000 UTC m=+994.137153939" watchObservedRunningTime="2026-02-17 15:37:12.611570113 +0000 UTC m=+994.142200554" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.182249 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.184117 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.223828 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.238809 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.259164 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=12.259144474 podStartE2EDuration="12.259144474s" podCreationTimestamp="2026-02-17 15:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:37:12.641905884 +0000 UTC m=+994.172536325" watchObservedRunningTime="2026-02-17 15:37:21.259144474 +0000 UTC m=+1002.789774885" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.286834 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.288828 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.330858 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.340642 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.672573 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 
15:37:21.672632 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.672645 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:21 crc kubenswrapper[4806]: I0217 15:37:21.672657 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.709661 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.710230 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.726102 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.726261 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.734997 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.814515 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:23 crc kubenswrapper[4806]: I0217 15:37:23.870442 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:24 crc kubenswrapper[4806]: I0217 15:37:24.699169 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-log" 
containerID="cri-o://7edb790f6c5ad591181fb2a53f5125c129e7e8279796108b2b0fc5f339f66354" gracePeriod=30 Feb 17 15:37:24 crc kubenswrapper[4806]: I0217 15:37:24.699795 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-httpd" containerID="cri-o://6ac3ff45a749230df5201c9c49bdd0ac711fc3bb6744252418165377093eb6b5" gracePeriod=30 Feb 17 15:37:24 crc kubenswrapper[4806]: I0217 15:37:24.705201 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.103:9292/healthcheck\": EOF" Feb 17 15:37:24 crc kubenswrapper[4806]: I0217 15:37:24.705333 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.103:9292/healthcheck\": EOF" Feb 17 15:37:25 crc kubenswrapper[4806]: I0217 15:37:25.713822 4806 generic.go:334] "Generic (PLEG): container finished" podID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerID="7edb790f6c5ad591181fb2a53f5125c129e7e8279796108b2b0fc5f339f66354" exitCode=143 Feb 17 15:37:25 crc kubenswrapper[4806]: I0217 15:37:25.714272 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerDied","Data":"7edb790f6c5ad591181fb2a53f5125c129e7e8279796108b2b0fc5f339f66354"} Feb 17 15:37:29 crc kubenswrapper[4806]: I0217 15:37:29.779627 4806 generic.go:334] "Generic (PLEG): container finished" podID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerID="6ac3ff45a749230df5201c9c49bdd0ac711fc3bb6744252418165377093eb6b5" exitCode=0 Feb 17 15:37:29 crc kubenswrapper[4806]: I0217 
15:37:29.780058 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerDied","Data":"6ac3ff45a749230df5201c9c49bdd0ac711fc3bb6744252418165377093eb6b5"} Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.188709 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265032 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265134 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265182 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265230 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265264 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265296 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxr5\" (UniqueName: \"kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265329 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265349 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265384 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265421 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265458 4806 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265449 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265449 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265477 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265515 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev" (OuterVolumeSpecName: "dev") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265571 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265606 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"c4c2e054-1dab-449c-9de6-9709e4f6d001\" (UID: \"c4c2e054-1dab-449c-9de6-9709e4f6d001\") " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265681 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265754 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265798 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265774 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys" (OuterVolumeSpecName: "sys") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265838 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run" (OuterVolumeSpecName: "run") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.265928 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs" (OuterVolumeSpecName: "logs") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267006 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267038 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267055 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c4c2e054-1dab-449c-9de6-9709e4f6d001-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267072 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267086 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267103 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267117 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267131 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.267145 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4c2e054-1dab-449c-9de6-9709e4f6d001-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.272648 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts" (OuterVolumeSpecName: "scripts") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.272863 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5" (OuterVolumeSpecName: "kube-api-access-8cxr5") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "kube-api-access-8cxr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.272649 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.290990 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.338730 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data" (OuterVolumeSpecName: "config-data") pod "c4c2e054-1dab-449c-9de6-9709e4f6d001" (UID: "c4c2e054-1dab-449c-9de6-9709e4f6d001"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.369023 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.369058 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cxr5\" (UniqueName: \"kubernetes.io/projected/c4c2e054-1dab-449c-9de6-9709e4f6d001-kube-api-access-8cxr5\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.369073 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4c2e054-1dab-449c-9de6-9709e4f6d001-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.369105 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.369122 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.385417 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") 
on node "crc" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.385611 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.470509 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.470551 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.790805 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"c4c2e054-1dab-449c-9de6-9709e4f6d001","Type":"ContainerDied","Data":"1e2ebb142922a29f80042505175b299f3519c1101b04af4c0be4b8b8420c78c1"} Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.790863 4806 scope.go:117] "RemoveContainer" containerID="6ac3ff45a749230df5201c9c49bdd0ac711fc3bb6744252418165377093eb6b5" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.790880 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.823573 4806 scope.go:117] "RemoveContainer" containerID="7edb790f6c5ad591181fb2a53f5125c129e7e8279796108b2b0fc5f339f66354" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.852338 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.858385 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.865679 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:30 crc kubenswrapper[4806]: E0217 15:37:30.866039 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-httpd" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.866063 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-httpd" Feb 17 15:37:30 crc kubenswrapper[4806]: E0217 15:37:30.866081 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-log" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.866089 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-log" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.866240 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-httpd" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.866263 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" containerName="glance-log" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.866956 4806 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.877056 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.979806 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.979845 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.979869 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.979885 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjdlw\" (UniqueName: \"kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.979920 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980007 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980109 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980149 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980170 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980211 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980259 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980284 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980353 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:30 crc kubenswrapper[4806]: I0217 15:37:30.980378 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082157 4806 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082198 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082225 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082246 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjdlw\" (UniqueName: \"kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082293 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082315 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082358 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082381 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082394 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082421 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082507 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules\") pod \"glance-default-single-0\" (UID: 
\"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082518 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.082779 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083127 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083186 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083500 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc 
kubenswrapper[4806]: I0217 15:37:31.083533 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083562 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083700 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083734 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083787 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083816 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083818 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.083859 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.084048 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.089119 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.098182 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts\") pod \"glance-default-single-0\" (UID: 
\"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.104894 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjdlw\" (UniqueName: \"kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.105984 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.109178 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.170324 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c2e054-1dab-449c-9de6-9709e4f6d001" path="/var/lib/kubelet/pods/c4c2e054-1dab-449c-9de6-9709e4f6d001/volumes" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.193037 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.601068 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.799377 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerStarted","Data":"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4"} Feb 17 15:37:31 crc kubenswrapper[4806]: I0217 15:37:31.799663 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerStarted","Data":"9519abfbc3c4aeb0b67d9f4bc1dc0dd217516ee5d31d3f42860aa8c945ad46c9"} Feb 17 15:37:32 crc kubenswrapper[4806]: I0217 15:37:32.810287 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerStarted","Data":"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7"} Feb 17 15:37:32 crc kubenswrapper[4806]: I0217 15:37:32.830102 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.8300852709999997 podStartE2EDuration="2.830085271s" podCreationTimestamp="2026-02-17 15:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:37:32.828229095 +0000 UTC m=+1014.358859516" watchObservedRunningTime="2026-02-17 15:37:32.830085271 +0000 UTC m=+1014.360715682" Feb 17 15:37:34 crc kubenswrapper[4806]: I0217 15:37:34.785236 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:37:34 crc kubenswrapper[4806]: I0217 15:37:34.785935 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.193304 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.194018 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.230804 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.243689 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.894621 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:41 crc kubenswrapper[4806]: I0217 15:37:41.894659 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:43 crc kubenswrapper[4806]: I0217 15:37:43.827101 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:43 crc kubenswrapper[4806]: I0217 15:37:43.906870 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:37:43 crc 
kubenswrapper[4806]: I0217 15:37:43.979730 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.104112 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s4pmr"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.112210 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-s4pmr"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.183613 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancef248-account-delete-rg4kp"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.184736 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.204560 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef248-account-delete-rg4kp"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.210304 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.210590 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-log" containerID="cri-o://35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7" gracePeriod=30 Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.210726 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-httpd" containerID="cri-o://37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637" gracePeriod=30 Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.221599 
4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.221835 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-log" containerID="cri-o://2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4" gracePeriod=30 Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.221968 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-httpd" containerID="cri-o://7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7" gracePeriod=30 Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.263537 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txl2t\" (UniqueName: \"kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.263602 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.306284 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.306759 4806 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/openstackclient" podUID="c706f003-5196-40be-8d86-015f57f65c3f" containerName="openstackclient" containerID="cri-o://d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324" gracePeriod=30 Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.364785 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txl2t\" (UniqueName: \"kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.364831 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.365668 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.385789 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txl2t\" (UniqueName: \"kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t\") pod \"glancef248-account-delete-rg4kp\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") " pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.501849 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.693830 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.770691 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config\") pod \"c706f003-5196-40be-8d86-015f57f65c3f\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.770757 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts\") pod \"c706f003-5196-40be-8d86-015f57f65c3f\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.770855 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret\") pod \"c706f003-5196-40be-8d86-015f57f65c3f\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.770950 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvmp\" (UniqueName: \"kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp\") pod \"c706f003-5196-40be-8d86-015f57f65c3f\" (UID: \"c706f003-5196-40be-8d86-015f57f65c3f\") " Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.772055 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod 
"c706f003-5196-40be-8d86-015f57f65c3f" (UID: "c706f003-5196-40be-8d86-015f57f65c3f"). InnerVolumeSpecName "openstack-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.775370 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp" (OuterVolumeSpecName: "kube-api-access-tnvmp") pod "c706f003-5196-40be-8d86-015f57f65c3f" (UID: "c706f003-5196-40be-8d86-015f57f65c3f"). InnerVolumeSpecName "kube-api-access-tnvmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.789708 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c706f003-5196-40be-8d86-015f57f65c3f" (UID: "c706f003-5196-40be-8d86-015f57f65c3f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.793645 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c706f003-5196-40be-8d86-015f57f65c3f" (UID: "c706f003-5196-40be-8d86-015f57f65c3f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.872778 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvmp\" (UniqueName: \"kubernetes.io/projected/c706f003-5196-40be-8d86-015f57f65c3f-kube-api-access-tnvmp\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.872817 4806 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.873177 4806 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/c706f003-5196-40be-8d86-015f57f65c3f-openstack-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.873190 4806 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c706f003-5196-40be-8d86-015f57f65c3f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 15:37:58 crc kubenswrapper[4806]: I0217 15:37:58.992750 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef248-account-delete-rg4kp"] Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.037793 4806 generic.go:334] "Generic (PLEG): container finished" podID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerID="35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7" exitCode=143 Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.037848 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerDied","Data":"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7"} Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.039199 4806 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" event={"ID":"6f83c637-2c91-4a04-b861-8fbe7ef9f798","Type":"ContainerStarted","Data":"48e46e61d0090f755949208bcf390dc7539c0fc35c65f0d76e05c2f1e30d014c"} Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.040943 4806 generic.go:334] "Generic (PLEG): container finished" podID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerID="2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4" exitCode=143 Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.040978 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerDied","Data":"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4"} Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.043281 4806 generic.go:334] "Generic (PLEG): container finished" podID="c706f003-5196-40be-8d86-015f57f65c3f" containerID="d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324" exitCode=143 Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.043305 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"c706f003-5196-40be-8d86-015f57f65c3f","Type":"ContainerDied","Data":"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324"} Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.043318 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"c706f003-5196-40be-8d86-015f57f65c3f","Type":"ContainerDied","Data":"682a9037a7c6f81f52f1d47645b8c8157c9b4c848035cfaa23a11160f836af7b"} Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.043334 4806 scope.go:117] "RemoveContainer" containerID="d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324" Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.043445 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.087906 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.088433 4806 scope.go:117] "RemoveContainer" containerID="d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324"
Feb 17 15:37:59 crc kubenswrapper[4806]: E0217 15:37:59.088871 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324\": container with ID starting with d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324 not found: ID does not exist" containerID="d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324"
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.088908 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324"} err="failed to get container status \"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324\": rpc error: code = NotFound desc = could not find container \"d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324\": container with ID starting with d4cfa973e7126f5766516f710981bb2c90e3de13c3442c607b8e75807aef1324 not found: ID does not exist"
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.094442 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.169465 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590c794e-303e-4a1b-bb48-846372bf5d53" path="/var/lib/kubelet/pods/590c794e-303e-4a1b-bb48-846372bf5d53/volumes"
Feb 17 15:37:59 crc kubenswrapper[4806]: I0217 15:37:59.170347 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c706f003-5196-40be-8d86-015f57f65c3f" path="/var/lib/kubelet/pods/c706f003-5196-40be-8d86-015f57f65c3f/volumes"
Feb 17 15:38:00 crc kubenswrapper[4806]: I0217 15:38:00.052750 4806 generic.go:334] "Generic (PLEG): container finished" podID="6f83c637-2c91-4a04-b861-8fbe7ef9f798" containerID="696f7ec3bc821bbccf1af2780552bd1436404e92fc56a9bc576b8f2218ed1644" exitCode=0
Feb 17 15:38:00 crc kubenswrapper[4806]: I0217 15:38:00.052827 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" event={"ID":"6f83c637-2c91-4a04-b861-8fbe7ef9f798","Type":"ContainerDied","Data":"696f7ec3bc821bbccf1af2780552bd1436404e92fc56a9bc576b8f2218ed1644"}
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.370589 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read tcp 10.217.0.2:38196->10.217.0.104:9292: read: connection reset by peer"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.372347 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.104:9292/healthcheck\": read tcp 10.217.0.2:38190->10.217.0.104:9292: read: connection reset by peer"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.612236 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.719244 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txl2t\" (UniqueName: \"kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t\") pod \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.719295 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts\") pod \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\" (UID: \"6f83c637-2c91-4a04-b861-8fbe7ef9f798\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.720550 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f83c637-2c91-4a04-b861-8fbe7ef9f798" (UID: "6f83c637-2c91-4a04-b861-8fbe7ef9f798"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.723236 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.728126 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t" (OuterVolumeSpecName: "kube-api-access-txl2t") pod "6f83c637-2c91-4a04-b861-8fbe7ef9f798" (UID: "6f83c637-2c91-4a04-b861-8fbe7ef9f798"). InnerVolumeSpecName "kube-api-access-txl2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.772646 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820235 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820269 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820287 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820345 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820368 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820419 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820442 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820461 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820486 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820507 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820539 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820553 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820586 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820611 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hskx6\" (UniqueName: \"kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6\") pod \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\" (UID: \"f93031a1-739a-4e8c-8dcf-04518d36e0f8\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820845 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txl2t\" (UniqueName: \"kubernetes.io/projected/6f83c637-2c91-4a04-b861-8fbe7ef9f798-kube-api-access-txl2t\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.820857 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f83c637-2c91-4a04-b861-8fbe7ef9f798-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821341 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821387 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys" (OuterVolumeSpecName: "sys") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821386 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev" (OuterVolumeSpecName: "dev") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821428 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run" (OuterVolumeSpecName: "run") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821438 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821703 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821717 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821870 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs" (OuterVolumeSpecName: "logs") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.821935 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.825299 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.825332 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.825546 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts" (OuterVolumeSpecName: "scripts") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.825643 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6" (OuterVolumeSpecName: "kube-api-access-hskx6") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "kube-api-access-hskx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.867751 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data" (OuterVolumeSpecName: "config-data") pod "f93031a1-739a-4e8c-8dcf-04518d36e0f8" (UID: "f93031a1-739a-4e8c-8dcf-04518d36e0f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.921963 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922049 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922071 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922108 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922158 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922193 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922243 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922291 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922326 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922360 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922415 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjdlw\" (UniqueName: \"kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922448 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922467 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922493 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run\") pod \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\" (UID: \"2da07ff2-69c2-48de-82f3-b7516be8ed6d\") "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922857 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-nvme\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922871 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-lib-modules\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922880 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922890 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93031a1-739a-4e8c-8dcf-04518d36e0f8-logs\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922900 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-etc-iscsi\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922921 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922930 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-run\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922944 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922953 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-dev\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922962 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.922971 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-var-locks-brick\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.923002 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93031a1-739a-4e8c-8dcf-04518d36e0f8-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.923012 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hskx6\" (UniqueName: \"kubernetes.io/projected/f93031a1-739a-4e8c-8dcf-04518d36e0f8-kube-api-access-hskx6\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.923021 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f93031a1-739a-4e8c-8dcf-04518d36e0f8-sys\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.925734 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.925841 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev" (OuterVolumeSpecName: "dev") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.925872 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926034 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926167 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs" (OuterVolumeSpecName: "logs") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926216 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys" (OuterVolumeSpecName: "sys") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926397 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run" (OuterVolumeSpecName: "run") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926444 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.926473 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.928963 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw" (OuterVolumeSpecName: "kube-api-access-pjdlw") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "kube-api-access-pjdlw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.931819 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts" (OuterVolumeSpecName: "scripts") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.934355 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.942730 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.954570 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.955155 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Feb 17 15:38:01 crc kubenswrapper[4806]: I0217 15:38:01.976120 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data" (OuterVolumeSpecName: "config-data") pod "2da07ff2-69c2-48de-82f3-b7516be8ed6d" (UID: "2da07ff2-69c2-48de-82f3-b7516be8ed6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024613 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-iscsi\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024670 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-run\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024683 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-etc-nvme\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024697 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024709 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-dev\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024719 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-lib-modules\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024730 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjdlw\" (UniqueName: \"kubernetes.io/projected/2da07ff2-69c2-48de-82f3-b7516be8ed6d-kube-api-access-pjdlw\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024743 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-var-locks-brick\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024753 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da07ff2-69c2-48de-82f3-b7516be8ed6d-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024763 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024774 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024786 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2da07ff2-69c2-48de-82f3-b7516be8ed6d-sys\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024796 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024834 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024850 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" "
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.024862 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da07ff2-69c2-48de-82f3-b7516be8ed6d-logs\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.040438 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.040640 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.076471 4806 generic.go:334] "Generic (PLEG): container finished" podID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerID="37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637" exitCode=0
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.076534 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerDied","Data":"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637"}
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.076560 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"f93031a1-739a-4e8c-8dcf-04518d36e0f8","Type":"ContainerDied","Data":"fde3e5ff8c519fcf2663b943dab22b77d2f8913ffca201213e6f916a2693c4cb"}
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.076579 4806 scope.go:117] "RemoveContainer" containerID="37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.076701 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.080963 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp" event={"ID":"6f83c637-2c91-4a04-b861-8fbe7ef9f798","Type":"ContainerDied","Data":"48e46e61d0090f755949208bcf390dc7539c0fc35c65f0d76e05c2f1e30d014c"}
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.080999 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef248-account-delete-rg4kp"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.081122 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48e46e61d0090f755949208bcf390dc7539c0fc35c65f0d76e05c2f1e30d014c"
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.082943 4806 generic.go:334] "Generic (PLEG): container finished" podID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerID="7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7" exitCode=0
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.082971 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerDied","Data":"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7"}
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.082987 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"2da07ff2-69c2-48de-82f3-b7516be8ed6d","Type":"ContainerDied","Data":"9519abfbc3c4aeb0b67d9f4bc1dc0dd217516ee5d31d3f42860aa8c945ad46c9"}
Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.083012 4806 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.103924 4806 scope.go:117] "RemoveContainer" containerID="35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.117267 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.124035 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.126566 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.126639 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.138517 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.141267 4806 scope.go:117] "RemoveContainer" containerID="37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637" Feb 17 15:38:02 crc kubenswrapper[4806]: E0217 15:38:02.141869 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637\": container with ID starting with 37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637 not found: ID does not exist" containerID="37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.141911 4806 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637"} err="failed to get container status \"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637\": rpc error: code = NotFound desc = could not find container \"37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637\": container with ID starting with 37b475484cf3c07120280bc00b7f7e9c8e5363a418caf12aea21b99346551637 not found: ID does not exist" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.141937 4806 scope.go:117] "RemoveContainer" containerID="35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7" Feb 17 15:38:02 crc kubenswrapper[4806]: E0217 15:38:02.142569 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7\": container with ID starting with 35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7 not found: ID does not exist" containerID="35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.142646 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7"} err="failed to get container status \"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7\": rpc error: code = NotFound desc = could not find container \"35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7\": container with ID starting with 35172432bf35f399e8d4f42643109d139cf9df95e94302e2171da785cc0c26b7 not found: ID does not exist" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.142665 4806 scope.go:117] "RemoveContainer" containerID="7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.144152 4806 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.160769 4806 scope.go:117] "RemoveContainer" containerID="2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.189170 4806 scope.go:117] "RemoveContainer" containerID="7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7" Feb 17 15:38:02 crc kubenswrapper[4806]: E0217 15:38:02.189899 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7\": container with ID starting with 7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7 not found: ID does not exist" containerID="7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.189928 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7"} err="failed to get container status \"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7\": rpc error: code = NotFound desc = could not find container \"7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7\": container with ID starting with 7d05c4e53eb08f457c5f9bba3f1549fdc4aa13ed8385a3831a40aa0c1e5c46a7 not found: ID does not exist" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.189950 4806 scope.go:117] "RemoveContainer" containerID="2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4" Feb 17 15:38:02 crc kubenswrapper[4806]: E0217 15:38:02.190550 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4\": container with ID starting with 
2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4 not found: ID does not exist" containerID="2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4" Feb 17 15:38:02 crc kubenswrapper[4806]: I0217 15:38:02.190572 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4"} err="failed to get container status \"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4\": rpc error: code = NotFound desc = could not find container \"2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4\": container with ID starting with 2f640d63f9a2f117170956d66d223919308972d20a52458633980a79fe829cb4 not found: ID does not exist" Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.171981 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" path="/var/lib/kubelet/pods/2da07ff2-69c2-48de-82f3-b7516be8ed6d/volumes" Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.173659 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" path="/var/lib/kubelet/pods/f93031a1-739a-4e8c-8dcf-04518d36e0f8/volumes" Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.223780 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-qbtnx"] Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.237962 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-qbtnx"] Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.249231 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-f248-account-create-update-7lszd"] Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.256261 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancef248-account-delete-rg4kp"] Feb 17 15:38:03 crc 
kubenswrapper[4806]: I0217 15:38:03.262381 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-f248-account-create-update-7lszd"] Feb 17 15:38:03 crc kubenswrapper[4806]: I0217 15:38:03.273570 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancef248-account-delete-rg4kp"] Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174501 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-sg7zz"] Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.174864 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f83c637-2c91-4a04-b861-8fbe7ef9f798" containerName="mariadb-account-delete" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174893 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f83c637-2c91-4a04-b861-8fbe7ef9f798" containerName="mariadb-account-delete" Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.174906 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c706f003-5196-40be-8d86-015f57f65c3f" containerName="openstackclient" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174913 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="c706f003-5196-40be-8d86-015f57f65c3f" containerName="openstackclient" Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.174927 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174935 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.174971 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174979 4806 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.174988 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.174994 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: E0217 15:38:04.175002 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175008 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175165 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="c706f003-5196-40be-8d86-015f57f65c3f" containerName="openstackclient" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175203 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f83c637-2c91-4a04-b861-8fbe7ef9f798" containerName="mariadb-account-delete" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175213 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175227 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da07ff2-69c2-48de-82f3-b7516be8ed6d" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175233 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-log" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175242 4806 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f93031a1-739a-4e8c-8dcf-04518d36e0f8" containerName="glance-httpd" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.175883 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.181677 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn"] Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.183202 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.184779 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.198695 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn"] Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.248003 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-sg7zz"] Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.254897 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxx6t\" (UniqueName: \"kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t\") pod \"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.254943 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts\") pod \"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " 
pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.356004 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.356084 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxx6t\" (UniqueName: \"kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t\") pod \"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.356118 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts\") pod \"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.356158 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6gf\" (UniqueName: \"kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.357318 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts\") pod 
\"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.376019 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxx6t\" (UniqueName: \"kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t\") pod \"glance-db-create-sg7zz\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.458349 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.458566 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6gf\" (UniqueName: \"kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.459677 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.474638 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6gf\" (UniqueName: 
\"kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf\") pod \"glance-3ba2-account-create-update-8xsdn\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.498766 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.534909 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.784112 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.784372 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.784547 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.785125 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 17 15:38:04 crc kubenswrapper[4806]: I0217 15:38:04.785176 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a" gracePeriod=600 Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.103180 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-sg7zz"] Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.110821 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a" exitCode=0 Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.110858 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a"} Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.110886 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88"} Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.110903 4806 scope.go:117] "RemoveContainer" containerID="1045b513b3ae59cf6bef863ffd1792d3b68fa1978b67b7c641086276ee981395" Feb 17 15:38:05 crc kubenswrapper[4806]: W0217 15:38:05.111840 4806 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ad740f0_0a0a_4461_8569_e1c4f41663c2.slice/crio-24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd WatchSource:0}: Error finding container 24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd: Status 404 returned error can't find the container with id 24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.171744 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f83c637-2c91-4a04-b861-8fbe7ef9f798" path="/var/lib/kubelet/pods/6f83c637-2c91-4a04-b861-8fbe7ef9f798/volumes" Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.172861 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84493b4d-1972-4ba3-b1ca-e48023412343" path="/var/lib/kubelet/pods/84493b4d-1972-4ba3-b1ca-e48023412343/volumes" Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.173644 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9135ba-f2fb-4bd2-94d8-5c60d6797991" path="/var/lib/kubelet/pods/fd9135ba-f2fb-4bd2-94d8-5c60d6797991/volumes" Feb 17 15:38:05 crc kubenswrapper[4806]: I0217 15:38:05.871964 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn"] Feb 17 15:38:05 crc kubenswrapper[4806]: W0217 15:38:05.875642 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82d53fc8_bccb_41ed_a2f9_bc0bb4f93863.slice/crio-f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8 WatchSource:0}: Error finding container f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8: Status 404 returned error can't find the container with id f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8 Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.119991 4806 generic.go:334] "Generic (PLEG): container 
finished" podID="5ad740f0-0a0a-4461-8569-e1c4f41663c2" containerID="eea8ded70f093ffd3ebc7eeefc7c38d7915ff70286d2355e120e11c4186b77a4" exitCode=0 Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.120074 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sg7zz" event={"ID":"5ad740f0-0a0a-4461-8569-e1c4f41663c2","Type":"ContainerDied","Data":"eea8ded70f093ffd3ebc7eeefc7c38d7915ff70286d2355e120e11c4186b77a4"} Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.120107 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sg7zz" event={"ID":"5ad740f0-0a0a-4461-8569-e1c4f41663c2","Type":"ContainerStarted","Data":"24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd"} Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.121868 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" event={"ID":"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863","Type":"ContainerStarted","Data":"325c2d9c4132f5584db41d5eaf4bd4748f5ff01e7fd614a5368b2175270a46e1"} Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.121896 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" event={"ID":"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863","Type":"ContainerStarted","Data":"f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8"} Feb 17 15:38:06 crc kubenswrapper[4806]: I0217 15:38:06.155083 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" podStartSLOduration=2.155061374 podStartE2EDuration="2.155061374s" podCreationTimestamp="2026-02-17 15:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:06.153170757 +0000 UTC m=+1047.683801178" watchObservedRunningTime="2026-02-17 
15:38:06.155061374 +0000 UTC m=+1047.685691785" Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.137580 4806 generic.go:334] "Generic (PLEG): container finished" podID="82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" containerID="325c2d9c4132f5584db41d5eaf4bd4748f5ff01e7fd614a5368b2175270a46e1" exitCode=0 Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.137733 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" event={"ID":"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863","Type":"ContainerDied","Data":"325c2d9c4132f5584db41d5eaf4bd4748f5ff01e7fd614a5368b2175270a46e1"} Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.434192 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.443898 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts\") pod \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.444007 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxx6t\" (UniqueName: \"kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t\") pod \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\" (UID: \"5ad740f0-0a0a-4461-8569-e1c4f41663c2\") " Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.444925 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ad740f0-0a0a-4461-8569-e1c4f41663c2" (UID: "5ad740f0-0a0a-4461-8569-e1c4f41663c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.454656 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t" (OuterVolumeSpecName: "kube-api-access-zxx6t") pod "5ad740f0-0a0a-4461-8569-e1c4f41663c2" (UID: "5ad740f0-0a0a-4461-8569-e1c4f41663c2"). InnerVolumeSpecName "kube-api-access-zxx6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.546073 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ad740f0-0a0a-4461-8569-e1c4f41663c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:07 crc kubenswrapper[4806]: I0217 15:38:07.546107 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxx6t\" (UniqueName: \"kubernetes.io/projected/5ad740f0-0a0a-4461-8569-e1c4f41663c2-kube-api-access-zxx6t\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.146628 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-sg7zz" event={"ID":"5ad740f0-0a0a-4461-8569-e1c4f41663c2","Type":"ContainerDied","Data":"24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd"} Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.146657 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-sg7zz" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.146676 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24298b4a78f62d5a4bcb72bf4f5b0392c9db4f2c25aa57fa4f492f384c780ffd" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.417795 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.557301 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6gf\" (UniqueName: \"kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf\") pod \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.557418 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts\") pod \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\" (UID: \"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863\") " Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.558634 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" (UID: "82d53fc8-bccb-41ed-a2f9-bc0bb4f93863"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.561630 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf" (OuterVolumeSpecName: "kube-api-access-pd6gf") pod "82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" (UID: "82d53fc8-bccb-41ed-a2f9-bc0bb4f93863"). InnerVolumeSpecName "kube-api-access-pd6gf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.658981 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6gf\" (UniqueName: \"kubernetes.io/projected/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-kube-api-access-pd6gf\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:08 crc kubenswrapper[4806]: I0217 15:38:08.659023 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.157217 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" event={"ID":"82d53fc8-bccb-41ed-a2f9-bc0bb4f93863","Type":"ContainerDied","Data":"f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8"} Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.157551 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1c3a66e2b07e64f6dfb1f211bc16dea62251c3515e3acabf544943c84ff12f8" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.157314 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.391149 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-5kjsc"] Feb 17 15:38:09 crc kubenswrapper[4806]: E0217 15:38:09.391576 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ad740f0-0a0a-4461-8569-e1c4f41663c2" containerName="mariadb-database-create" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.391599 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad740f0-0a0a-4461-8569-e1c4f41663c2" containerName="mariadb-database-create" Feb 17 15:38:09 crc kubenswrapper[4806]: E0217 15:38:09.391614 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" containerName="mariadb-account-create-update" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.391621 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" containerName="mariadb-account-create-update" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.391779 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" containerName="mariadb-account-create-update" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.391798 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad740f0-0a0a-4461-8569-e1c4f41663c2" containerName="mariadb-database-create" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.392357 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.393891 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-rffdj" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.394048 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.394065 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.397731 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5kjsc"] Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.571322 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9twj\" (UniqueName: \"kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.571384 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.571450 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc 
kubenswrapper[4806]: I0217 15:38:09.571472 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.673555 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.673642 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.673761 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9twj\" (UniqueName: \"kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.673853 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.681234 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.681354 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.681463 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:09 crc kubenswrapper[4806]: I0217 15:38:09.711929 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9twj\" (UniqueName: \"kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj\") pod \"glance-db-sync-5kjsc\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:10 crc kubenswrapper[4806]: I0217 15:38:10.009313 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:10 crc kubenswrapper[4806]: I0217 15:38:10.254310 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5kjsc"] Feb 17 15:38:10 crc kubenswrapper[4806]: W0217 15:38:10.261044 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3acd4b_49c7_4d20_b19c_a24e46728e3e.slice/crio-970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722 WatchSource:0}: Error finding container 970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722: Status 404 returned error can't find the container with id 970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722 Feb 17 15:38:11 crc kubenswrapper[4806]: I0217 15:38:11.176099 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5kjsc" event={"ID":"0a3acd4b-49c7-4d20-b19c-a24e46728e3e","Type":"ContainerStarted","Data":"0d414e3f352b4d6c2dd6a65417dd18f087e5bbdcbaa5ef04b4f0e3ff4b613414"} Feb 17 15:38:11 crc kubenswrapper[4806]: I0217 15:38:11.176443 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5kjsc" event={"ID":"0a3acd4b-49c7-4d20-b19c-a24e46728e3e","Type":"ContainerStarted","Data":"970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722"} Feb 17 15:38:11 crc kubenswrapper[4806]: I0217 15:38:11.211391 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-5kjsc" podStartSLOduration=2.211371466 podStartE2EDuration="2.211371466s" podCreationTimestamp="2026-02-17 15:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:11.199994924 +0000 UTC m=+1052.730625365" watchObservedRunningTime="2026-02-17 15:38:11.211371466 +0000 UTC m=+1052.742001887" Feb 17 15:38:14 crc 
kubenswrapper[4806]: I0217 15:38:14.205965 4806 generic.go:334] "Generic (PLEG): container finished" podID="0a3acd4b-49c7-4d20-b19c-a24e46728e3e" containerID="0d414e3f352b4d6c2dd6a65417dd18f087e5bbdcbaa5ef04b4f0e3ff4b613414" exitCode=0 Feb 17 15:38:14 crc kubenswrapper[4806]: I0217 15:38:14.206077 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5kjsc" event={"ID":"0a3acd4b-49c7-4d20-b19c-a24e46728e3e","Type":"ContainerDied","Data":"0d414e3f352b4d6c2dd6a65417dd18f087e5bbdcbaa5ef04b4f0e3ff4b613414"} Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.516711 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.667684 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data\") pod \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.667799 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9twj\" (UniqueName: \"kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj\") pod \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.667843 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data\") pod \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.667891 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle\") pod \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\" (UID: \"0a3acd4b-49c7-4d20-b19c-a24e46728e3e\") " Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.674138 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj" (OuterVolumeSpecName: "kube-api-access-w9twj") pod "0a3acd4b-49c7-4d20-b19c-a24e46728e3e" (UID: "0a3acd4b-49c7-4d20-b19c-a24e46728e3e"). InnerVolumeSpecName "kube-api-access-w9twj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.675202 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0a3acd4b-49c7-4d20-b19c-a24e46728e3e" (UID: "0a3acd4b-49c7-4d20-b19c-a24e46728e3e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.698168 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a3acd4b-49c7-4d20-b19c-a24e46728e3e" (UID: "0a3acd4b-49c7-4d20-b19c-a24e46728e3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.709250 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data" (OuterVolumeSpecName: "config-data") pod "0a3acd4b-49c7-4d20-b19c-a24e46728e3e" (UID: "0a3acd4b-49c7-4d20-b19c-a24e46728e3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.770124 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9twj\" (UniqueName: \"kubernetes.io/projected/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-kube-api-access-w9twj\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.770184 4806 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.770198 4806 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:15 crc kubenswrapper[4806]: I0217 15:38:15.770211 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3acd4b-49c7-4d20-b19c-a24e46728e3e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:16 crc kubenswrapper[4806]: I0217 15:38:16.226864 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-5kjsc" event={"ID":"0a3acd4b-49c7-4d20-b19c-a24e46728e3e","Type":"ContainerDied","Data":"970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722"} Feb 17 15:38:16 crc kubenswrapper[4806]: I0217 15:38:16.227489 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970aac6fd49b850b58b9bd33636fd11d1c696b690aa38ae96de912c8b1ec2722" Feb 17 15:38:16 crc kubenswrapper[4806]: I0217 15:38:16.227047 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-5kjsc" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.434300 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:17 crc kubenswrapper[4806]: E0217 15:38:17.434802 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3acd4b-49c7-4d20-b19c-a24e46728e3e" containerName="glance-db-sync" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.434825 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3acd4b-49c7-4d20-b19c-a24e46728e3e" containerName="glance-db-sync" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.435058 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3acd4b-49c7-4d20-b19c-a24e46728e3e" containerName="glance-db-sync" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.436259 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.437774 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.437911 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.438773 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.438795 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.439048 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.439054 4806 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-rffdj" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.454947 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593209 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvjrn\" (UniqueName: \"kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593275 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593324 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593357 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593384 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593479 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593592 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593647 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.593710 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.695474 
4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.695581 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.695851 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696096 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696144 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696191 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696719 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696760 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696785 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696822 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.696887 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjrn\" (UniqueName: 
\"kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.697551 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.702446 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.703783 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.704002 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.704653 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs\") pod 
\"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.707892 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.718576 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvjrn\" (UniqueName: \"kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.726523 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:17 crc kubenswrapper[4806]: I0217 15:38:17.771174 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:18 crc kubenswrapper[4806]: I0217 15:38:18.219440 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:18 crc kubenswrapper[4806]: I0217 15:38:18.240637 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerStarted","Data":"a39f1ab2e835c10bb93d58de719115bd7ab7baefae162b57e92e6cb540d95acd"}
Feb 17 15:38:19 crc kubenswrapper[4806]: I0217 15:38:19.253057 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerStarted","Data":"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"}
Feb 17 15:38:20 crc kubenswrapper[4806]: I0217 15:38:20.263601 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerStarted","Data":"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"}
Feb 17 15:38:20 crc kubenswrapper[4806]: I0217 15:38:20.298078 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.298053213 podStartE2EDuration="3.298053213s" podCreationTimestamp="2026-02-17 15:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:20.284666432 +0000 UTC m=+1061.815296923" watchObservedRunningTime="2026-02-17 15:38:20.298053213 +0000 UTC m=+1061.828683624"
Feb 17 15:38:27 crc kubenswrapper[4806]: I0217 15:38:27.771477 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:27 crc kubenswrapper[4806]: I0217 15:38:27.772082 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:27 crc kubenswrapper[4806]: I0217 15:38:27.802998 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:27 crc kubenswrapper[4806]: I0217 15:38:27.816095 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:28 crc kubenswrapper[4806]: I0217 15:38:28.337027 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:28 crc kubenswrapper[4806]: I0217 15:38:28.337440 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:30 crc kubenswrapper[4806]: I0217 15:38:30.399395 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:30 crc kubenswrapper[4806]: I0217 15:38:30.399628 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 15:38:30 crc kubenswrapper[4806]: I0217 15:38:30.401456 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.143277 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5kjsc"]
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.150044 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-5kjsc"]
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.195333 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3ba2-account-delete-cpctq"]
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.196434 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.212571 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3ba2-account-delete-cpctq"]
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.314628 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.325740 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.325806 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jzx\" (UniqueName: \"kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.370345 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-log" containerID="cri-o://d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173" gracePeriod=30
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.370447 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-httpd" containerID="cri-o://8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e" gracePeriod=30
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.427819 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.427912 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85jzx\" (UniqueName: \"kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.429245 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.449765 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jzx\" (UniqueName: \"kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx\") pod \"glance3ba2-account-delete-cpctq\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") " pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:32 crc kubenswrapper[4806]: I0217 15:38:32.568209 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.021040 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3ba2-account-delete-cpctq"]
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.172201 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3acd4b-49c7-4d20-b19c-a24e46728e3e" path="/var/lib/kubelet/pods/0a3acd4b-49c7-4d20-b19c-a24e46728e3e/volumes"
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.379138 4806 generic.go:334] "Generic (PLEG): container finished" podID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerID="d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173" exitCode=143
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.379261 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerDied","Data":"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"}
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.381694 4806 generic.go:334] "Generic (PLEG): container finished" podID="9db361b3-6a89-4436-aa66-0fe5464e1ee6" containerID="136fa9edc2ab43ff9711cbe9b2702924316c123a25f4c5ae19568b0d9f6dd1fe" exitCode=0
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.381727 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq" event={"ID":"9db361b3-6a89-4436-aa66-0fe5464e1ee6","Type":"ContainerDied","Data":"136fa9edc2ab43ff9711cbe9b2702924316c123a25f4c5ae19568b0d9f6dd1fe"}
Feb 17 15:38:33 crc kubenswrapper[4806]: I0217 15:38:33.381753 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq" event={"ID":"9db361b3-6a89-4436-aa66-0fe5464e1ee6","Type":"ContainerStarted","Data":"a46262e0131cdc1f9c4c43cda1aa7b37a69fb60a82002052262ae53e683d9beb"}
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.734858 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.869184 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85jzx\" (UniqueName: \"kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx\") pod \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") "
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.869744 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts\") pod \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\" (UID: \"9db361b3-6a89-4436-aa66-0fe5464e1ee6\") "
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.870592 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9db361b3-6a89-4436-aa66-0fe5464e1ee6" (UID: "9db361b3-6a89-4436-aa66-0fe5464e1ee6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.879501 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx" (OuterVolumeSpecName: "kube-api-access-85jzx") pod "9db361b3-6a89-4436-aa66-0fe5464e1ee6" (UID: "9db361b3-6a89-4436-aa66-0fe5464e1ee6"). InnerVolumeSpecName "kube-api-access-85jzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.973379 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9db361b3-6a89-4436-aa66-0fe5464e1ee6-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:34 crc kubenswrapper[4806]: I0217 15:38:34.973442 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85jzx\" (UniqueName: \"kubernetes.io/projected/9db361b3-6a89-4436-aa66-0fe5464e1ee6-kube-api-access-85jzx\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.400834 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq" event={"ID":"9db361b3-6a89-4436-aa66-0fe5464e1ee6","Type":"ContainerDied","Data":"a46262e0131cdc1f9c4c43cda1aa7b37a69fb60a82002052262ae53e683d9beb"}
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.400896 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46262e0131cdc1f9c4c43cda1aa7b37a69fb60a82002052262ae53e683d9beb"
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.400972 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3ba2-account-delete-cpctq"
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.881913 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.987814 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.987905 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.987934 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988001 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvjrn\" (UniqueName: \"kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988045 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988067 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988145 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988179 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.988228 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs\") pod \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\" (UID: \"9cb18c89-b9af-453a-ab7c-8d911deebe4a\") "
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.989545 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.990485 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs" (OuterVolumeSpecName: "logs") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.994744 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.994844 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn" (OuterVolumeSpecName: "kube-api-access-rvjrn") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "kube-api-access-rvjrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:38:35 crc kubenswrapper[4806]: I0217 15:38:35.998314 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts" (OuterVolumeSpecName: "scripts") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.014580 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.023336 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data" (OuterVolumeSpecName: "config-data") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.025825 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.031076 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cb18c89-b9af-453a-ab7c-8d911deebe4a" (UID: "9cb18c89-b9af-453a-ab7c-8d911deebe4a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089683 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089722 4806 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089735 4806 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089770 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089781 4806 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089790 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089800 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvjrn\" (UniqueName: \"kubernetes.io/projected/9cb18c89-b9af-453a-ab7c-8d911deebe4a-kube-api-access-rvjrn\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089810 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb18c89-b9af-453a-ab7c-8d911deebe4a-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.089820 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cb18c89-b9af-453a-ab7c-8d911deebe4a-logs\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.105717 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.190864 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.414089 4806 generic.go:334] "Generic (PLEG): container finished" podID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerID="8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e" exitCode=0
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.414146 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerDied","Data":"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"}
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.414188 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9cb18c89-b9af-453a-ab7c-8d911deebe4a","Type":"ContainerDied","Data":"a39f1ab2e835c10bb93d58de719115bd7ab7baefae162b57e92e6cb540d95acd"}
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.414213 4806 scope.go:117] "RemoveContainer" containerID="8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.414230 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.458035 4806 scope.go:117] "RemoveContainer" containerID="d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.467152 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.476254 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.497679 4806 scope.go:117] "RemoveContainer" containerID="8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"
Feb 17 15:38:36 crc kubenswrapper[4806]: E0217 15:38:36.498788 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e\": container with ID starting with 8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e not found: ID does not exist" containerID="8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.498855 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e"} err="failed to get container status \"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e\": rpc error: code = NotFound desc = could not find container \"8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e\": container with ID starting with 8ef3760dffc6c517890a9c57a0fea7d72d15c32462dc6e9789eb724f7251f41e not found: ID does not exist"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.498895 4806 scope.go:117] "RemoveContainer" containerID="d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"
Feb 17 15:38:36 crc kubenswrapper[4806]: E0217 15:38:36.499446 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173\": container with ID starting with d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173 not found: ID does not exist" containerID="d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"
Feb 17 15:38:36 crc kubenswrapper[4806]: I0217 15:38:36.499509 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173"} err="failed to get container status \"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173\": rpc error: code = NotFound desc = could not find container \"d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173\": container with ID starting with d6e49ac9f9e24e1ca877655ab00c5c9f67abbcd01c20ff0434307ef815cab173 not found: ID does not exist"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.177594 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" path="/var/lib/kubelet/pods/9cb18c89-b9af-453a-ab7c-8d911deebe4a/volumes"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.213962 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-sg7zz"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.220611 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-sg7zz"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.234157 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3ba2-account-delete-cpctq"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.241335 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.247195 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3ba2-account-create-update-8xsdn"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.252876 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3ba2-account-delete-cpctq"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.440547 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-blg27"]
Feb 17 15:38:37 crc kubenswrapper[4806]: E0217 15:38:37.440946 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-httpd"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.440967 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-httpd"
Feb 17 15:38:37 crc kubenswrapper[4806]: E0217 15:38:37.440984 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-log"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.440991 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-log"
Feb 17 15:38:37 crc kubenswrapper[4806]: E0217 15:38:37.441018 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db361b3-6a89-4436-aa66-0fe5464e1ee6" containerName="mariadb-account-delete"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.441029 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db361b3-6a89-4436-aa66-0fe5464e1ee6" containerName="mariadb-account-delete"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.441163 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-httpd"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.441178 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db361b3-6a89-4436-aa66-0fe5464e1ee6" containerName="mariadb-account-delete"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.441188 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb18c89-b9af-453a-ab7c-8d911deebe4a" containerName="glance-log"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.441739 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.449515 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-blg27"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.469670 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-f285-account-create-update-vrq6c"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.470608 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.472502 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.491131 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f285-account-create-update-vrq6c"]
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.514639 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrhz\" (UniqueName: \"kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.514707 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.514838 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrc5\" (UniqueName: \"kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.514882 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.616026 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrhz\" (UniqueName: \"kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.616093 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.616151 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbrc5\" (UniqueName: \"kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.616174 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.617131 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.617792 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.636254 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrhz\" (UniqueName: \"kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz\") pod \"glance-f285-account-create-update-vrq6c\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.638964 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrc5\" (UniqueName: \"kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5\") pod \"glance-db-create-blg27\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.762839 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-blg27"
Feb 17 15:38:37 crc kubenswrapper[4806]: I0217 15:38:37.792990 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c"
Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.220327 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-blg27"]
Feb 17 15:38:38 crc kubenswrapper[4806]: W0217 15:38:38.223007 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod953bc43c_2b9c_4d7f_b38d_c0d35a394a6b.slice/crio-e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26 WatchSource:0}: Error finding container e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26: Status 404 returned error can't find the container with id e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26
Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.267957 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-f285-account-create-update-vrq6c"]
Feb 17 15:38:38 crc kubenswrapper[4806]: W0217 15:38:38.272620 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137fdb5e_a134_487c_bf9f_19991fdf35f3.slice/crio-85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391 WatchSource:0}: Error finding container 85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391: Status 404 returned error can't find the container with id 85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391
Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.435693 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-blg27" event={"ID":"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b","Type":"ContainerStarted","Data":"42ef9acdb2fb43af5b82233be748b8433d5161205fdb8c32cf7a4028002480a6"}
Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.435751 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="glance-kuttl-tests/glance-db-create-blg27" event={"ID":"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b","Type":"ContainerStarted","Data":"e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26"} Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.437655 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" event={"ID":"137fdb5e-a134-487c-bf9f-19991fdf35f3","Type":"ContainerStarted","Data":"871b270ca2c1250f908eb66371c899e5cf8d9f41a0fec922079133f56ba0bad2"} Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.437716 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" event={"ID":"137fdb5e-a134-487c-bf9f-19991fdf35f3","Type":"ContainerStarted","Data":"85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391"} Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.452842 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-create-blg27" podStartSLOduration=1.452818861 podStartE2EDuration="1.452818861s" podCreationTimestamp="2026-02-17 15:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:38.45156989 +0000 UTC m=+1079.982200341" watchObservedRunningTime="2026-02-17 15:38:38.452818861 +0000 UTC m=+1079.983449272" Feb 17 15:38:38 crc kubenswrapper[4806]: I0217 15:38:38.471858 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" podStartSLOduration=1.471835862 podStartE2EDuration="1.471835862s" podCreationTimestamp="2026-02-17 15:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:38.464112801 +0000 UTC m=+1079.994743212" watchObservedRunningTime="2026-02-17 
15:38:38.471835862 +0000 UTC m=+1080.002466273" Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.173984 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad740f0-0a0a-4461-8569-e1c4f41663c2" path="/var/lib/kubelet/pods/5ad740f0-0a0a-4461-8569-e1c4f41663c2/volumes" Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.175133 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d53fc8-bccb-41ed-a2f9-bc0bb4f93863" path="/var/lib/kubelet/pods/82d53fc8-bccb-41ed-a2f9-bc0bb4f93863/volumes" Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.175967 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db361b3-6a89-4436-aa66-0fe5464e1ee6" path="/var/lib/kubelet/pods/9db361b3-6a89-4436-aa66-0fe5464e1ee6/volumes" Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.448128 4806 generic.go:334] "Generic (PLEG): container finished" podID="137fdb5e-a134-487c-bf9f-19991fdf35f3" containerID="871b270ca2c1250f908eb66371c899e5cf8d9f41a0fec922079133f56ba0bad2" exitCode=0 Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.448236 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" event={"ID":"137fdb5e-a134-487c-bf9f-19991fdf35f3","Type":"ContainerDied","Data":"871b270ca2c1250f908eb66371c899e5cf8d9f41a0fec922079133f56ba0bad2"} Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.449964 4806 generic.go:334] "Generic (PLEG): container finished" podID="953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" containerID="42ef9acdb2fb43af5b82233be748b8433d5161205fdb8c32cf7a4028002480a6" exitCode=0 Feb 17 15:38:39 crc kubenswrapper[4806]: I0217 15:38:39.450025 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-blg27" event={"ID":"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b","Type":"ContainerDied","Data":"42ef9acdb2fb43af5b82233be748b8433d5161205fdb8c32cf7a4028002480a6"} Feb 17 15:38:40 crc kubenswrapper[4806]: 
I0217 15:38:40.854583 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.859651 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-blg27" Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.969100 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtrhz\" (UniqueName: \"kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz\") pod \"137fdb5e-a134-487c-bf9f-19991fdf35f3\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.970837 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbrc5\" (UniqueName: \"kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5\") pod \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.970983 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts\") pod \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\" (UID: \"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b\") " Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.971059 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts\") pod \"137fdb5e-a134-487c-bf9f-19991fdf35f3\" (UID: \"137fdb5e-a134-487c-bf9f-19991fdf35f3\") " Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.971845 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" (UID: "953bc43c-2b9c-4d7f-b38d-c0d35a394a6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.972047 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "137fdb5e-a134-487c-bf9f-19991fdf35f3" (UID: "137fdb5e-a134-487c-bf9f-19991fdf35f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.977603 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz" (OuterVolumeSpecName: "kube-api-access-jtrhz") pod "137fdb5e-a134-487c-bf9f-19991fdf35f3" (UID: "137fdb5e-a134-487c-bf9f-19991fdf35f3"). InnerVolumeSpecName "kube-api-access-jtrhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:40 crc kubenswrapper[4806]: I0217 15:38:40.978576 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5" (OuterVolumeSpecName: "kube-api-access-bbrc5") pod "953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" (UID: "953bc43c-2b9c-4d7f-b38d-c0d35a394a6b"). InnerVolumeSpecName "kube-api-access-bbrc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.073426 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbrc5\" (UniqueName: \"kubernetes.io/projected/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-kube-api-access-bbrc5\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.073475 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.073488 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/137fdb5e-a134-487c-bf9f-19991fdf35f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.073501 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtrhz\" (UniqueName: \"kubernetes.io/projected/137fdb5e-a134-487c-bf9f-19991fdf35f3-kube-api-access-jtrhz\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.480671 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-blg27" event={"ID":"953bc43c-2b9c-4d7f-b38d-c0d35a394a6b","Type":"ContainerDied","Data":"e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26"} Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.480721 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7f24ef0089ca515e53fe725754f88f2c27da7180754ddcfce6799b276054c26" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.481075 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-blg27" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.482169 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" event={"ID":"137fdb5e-a134-487c-bf9f-19991fdf35f3","Type":"ContainerDied","Data":"85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391"} Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.482191 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c8a2e0f5f1c2d45d9e5bea663e723a9e7b1b9df97cbcf9b30c15324d839391" Feb 17 15:38:41 crc kubenswrapper[4806]: I0217 15:38:41.482231 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-f285-account-create-update-vrq6c" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.663704 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-h5nmw"] Feb 17 15:38:42 crc kubenswrapper[4806]: E0217 15:38:42.664320 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="137fdb5e-a134-487c-bf9f-19991fdf35f3" containerName="mariadb-account-create-update" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.664336 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="137fdb5e-a134-487c-bf9f-19991fdf35f3" containerName="mariadb-account-create-update" Feb 17 15:38:42 crc kubenswrapper[4806]: E0217 15:38:42.664362 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" containerName="mariadb-database-create" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.664370 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" containerName="mariadb-database-create" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.664529 4806 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" containerName="mariadb-database-create" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.664553 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="137fdb5e-a134-487c-bf9f-19991fdf35f3" containerName="mariadb-account-create-update" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.665085 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.667258 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjlv5" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.671638 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.693202 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-h5nmw"] Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.698620 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.698694 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7vj\" (UniqueName: \"kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.698749 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.799626 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.799684 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh7vj\" (UniqueName: \"kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.799721 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.804312 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.804331 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.831580 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7vj\" (UniqueName: \"kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj\") pod \"glance-db-sync-h5nmw\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:42 crc kubenswrapper[4806]: I0217 15:38:42.992531 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:43 crc kubenswrapper[4806]: I0217 15:38:43.218563 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-h5nmw"] Feb 17 15:38:43 crc kubenswrapper[4806]: I0217 15:38:43.499162 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-h5nmw" event={"ID":"71c699d8-9f7e-4f32-822e-3e5f0c367100","Type":"ContainerStarted","Data":"2d6042b2cbadaf7bda26aa3d34e3e18236b35973b90aaad3736143776bac83c6"} Feb 17 15:38:44 crc kubenswrapper[4806]: I0217 15:38:44.508678 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-h5nmw" event={"ID":"71c699d8-9f7e-4f32-822e-3e5f0c367100","Type":"ContainerStarted","Data":"d8c949022626a671c3a88a99670a89d70eb2b8533a9641718ba4f5d37de884c4"} Feb 17 15:38:44 crc kubenswrapper[4806]: I0217 15:38:44.533136 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-h5nmw" podStartSLOduration=2.533117398 podStartE2EDuration="2.533117398s" podCreationTimestamp="2026-02-17 15:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 15:38:44.5267391 +0000 UTC m=+1086.057369531" watchObservedRunningTime="2026-02-17 15:38:44.533117398 +0000 UTC m=+1086.063747809" Feb 17 15:38:47 crc kubenswrapper[4806]: I0217 15:38:47.539920 4806 generic.go:334] "Generic (PLEG): container finished" podID="71c699d8-9f7e-4f32-822e-3e5f0c367100" containerID="d8c949022626a671c3a88a99670a89d70eb2b8533a9641718ba4f5d37de884c4" exitCode=0 Feb 17 15:38:47 crc kubenswrapper[4806]: I0217 15:38:47.540035 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-h5nmw" event={"ID":"71c699d8-9f7e-4f32-822e-3e5f0c367100","Type":"ContainerDied","Data":"d8c949022626a671c3a88a99670a89d70eb2b8533a9641718ba4f5d37de884c4"} Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.839688 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.891124 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data\") pod \"71c699d8-9f7e-4f32-822e-3e5f0c367100\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.891298 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data\") pod \"71c699d8-9f7e-4f32-822e-3e5f0c367100\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.891423 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh7vj\" (UniqueName: \"kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj\") pod \"71c699d8-9f7e-4f32-822e-3e5f0c367100\" (UID: \"71c699d8-9f7e-4f32-822e-3e5f0c367100\") " 
Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.898565 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "71c699d8-9f7e-4f32-822e-3e5f0c367100" (UID: "71c699d8-9f7e-4f32-822e-3e5f0c367100"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.898655 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj" (OuterVolumeSpecName: "kube-api-access-rh7vj") pod "71c699d8-9f7e-4f32-822e-3e5f0c367100" (UID: "71c699d8-9f7e-4f32-822e-3e5f0c367100"). InnerVolumeSpecName "kube-api-access-rh7vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.948920 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data" (OuterVolumeSpecName: "config-data") pod "71c699d8-9f7e-4f32-822e-3e5f0c367100" (UID: "71c699d8-9f7e-4f32-822e-3e5f0c367100"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.993907 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.993950 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh7vj\" (UniqueName: \"kubernetes.io/projected/71c699d8-9f7e-4f32-822e-3e5f0c367100-kube-api-access-rh7vj\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:48 crc kubenswrapper[4806]: I0217 15:38:48.993965 4806 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/71c699d8-9f7e-4f32-822e-3e5f0c367100-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.560703 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-h5nmw" event={"ID":"71c699d8-9f7e-4f32-822e-3e5f0c367100","Type":"ContainerDied","Data":"2d6042b2cbadaf7bda26aa3d34e3e18236b35973b90aaad3736143776bac83c6"} Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.560996 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6042b2cbadaf7bda26aa3d34e3e18236b35973b90aaad3736143776bac83c6" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.560846 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-h5nmw" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.944917 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:49 crc kubenswrapper[4806]: E0217 15:38:49.945298 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c699d8-9f7e-4f32-822e-3e5f0c367100" containerName="glance-db-sync" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.945315 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c699d8-9f7e-4f32-822e-3e5f0c367100" containerName="glance-db-sync" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.945494 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c699d8-9f7e-4f32-822e-3e5f0c367100" containerName="glance-db-sync" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.946692 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.949512 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.949822 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjlv5" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.950701 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 15:38:49 crc kubenswrapper[4806]: I0217 15:38:49.961141 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009366 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009482 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009517 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009532 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009555 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009626 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009648 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009680 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vwm\" (UniqueName: \"kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009701 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009728 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.009949 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.010010 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.010043 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.010138 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111517 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111589 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules\") pod 
\"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111625 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vwm\" (UniqueName: \"kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111640 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111658 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111683 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111680 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" 
Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111732 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111770 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.111696 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112107 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112130 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112230 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112455 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112474 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112525 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112542 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112578 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs\") pod 
\"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112608 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.112482 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.113089 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.113085 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.113137 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") device mount path \"/mnt/openstack/pv14\"" 
pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.113303 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.113707 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.126962 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.127041 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.132205 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vwm\" (UniqueName: \"kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.138871 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.141051 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.261506 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.503732 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.570027 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerStarted","Data":"d3414e3812a12b4cfff3b8ab4d8d54706e4778c79d3819d185784c550484bcb8"} Feb 17 15:38:50 crc kubenswrapper[4806]: I0217 15:38:50.836526 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.578954 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerStarted","Data":"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"} Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.579352 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerStarted","Data":"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"} Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.579364 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerStarted","Data":"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"} Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.579263 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-log" containerID="cri-o://7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b" gracePeriod=30 Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.579569 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-api" containerID="cri-o://4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc" gracePeriod=30 Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.579651 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-httpd" containerID="cri-o://205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1" gracePeriod=30 Feb 17 15:38:51 crc kubenswrapper[4806]: I0217 15:38:51.620663 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.620648344 podStartE2EDuration="2.620648344s" podCreationTimestamp="2026-02-17 15:38:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
15:38:51.615830905 +0000 UTC m=+1093.146461316" watchObservedRunningTime="2026-02-17 15:38:51.620648344 +0000 UTC m=+1093.151278755" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.006289 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043254 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043298 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vwm\" (UniqueName: \"kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043332 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043360 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043380 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data\") pod 
\"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043423 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043442 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043455 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043473 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043496 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043509 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod 
\"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043543 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043599 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.043636 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run\") pod \"073c829c-e361-48fb-9e1f-d39bc43ff752\" (UID: \"073c829c-e361-48fb-9e1f-d39bc43ff752\") " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.044113 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.044154 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.044558 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.044627 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys" (OuterVolumeSpecName: "sys") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.044957 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.045001 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.045028 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run" (OuterVolumeSpecName: "run") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.046931 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs" (OuterVolumeSpecName: "logs") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.047431 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev" (OuterVolumeSpecName: "dev") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.052690 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.052838 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts" (OuterVolumeSpecName: "scripts") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.053519 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm" (OuterVolumeSpecName: "kube-api-access-68vwm") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "kube-api-access-68vwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.054661 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.121715 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data" (OuterVolumeSpecName: "config-data") pod "073c829c-e361-48fb-9e1f-d39bc43ff752" (UID: "073c829c-e361-48fb-9e1f-d39bc43ff752"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145278 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145385 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145428 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vwm\" (UniqueName: \"kubernetes.io/projected/073c829c-e361-48fb-9e1f-d39bc43ff752-kube-api-access-68vwm\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145447 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145463 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145478 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145488 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073c829c-e361-48fb-9e1f-d39bc43ff752-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145498 4806 reconciler_common.go:293] "Volume detached for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145508 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145518 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145575 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145593 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145606 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073c829c-e361-48fb-9e1f-d39bc43ff752-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.145617 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/073c829c-e361-48fb-9e1f-d39bc43ff752-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.159439 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.164188 4806 operation_generator.go:917] UnmountDevice succeeded for 
volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.247772    4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.247964    4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\""
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591278    4806 generic.go:334] "Generic (PLEG): container finished" podID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc" exitCode=143
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591321    4806 generic.go:334] "Generic (PLEG): container finished" podID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1" exitCode=143
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591330    4806 generic.go:334] "Generic (PLEG): container finished" podID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b" exitCode=143
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591356    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerDied","Data":"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"}
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591392    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerDied","Data":"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"}
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591419    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerDied","Data":"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"}
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591430    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"073c829c-e361-48fb-9e1f-d39bc43ff752","Type":"ContainerDied","Data":"d3414e3812a12b4cfff3b8ab4d8d54706e4778c79d3819d185784c550484bcb8"}
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591451    4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.591451    4806 scope.go:117] "RemoveContainer" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.622936    4806 scope.go:117] "RemoveContainer" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.625723    4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.652026    4806 scope.go:117] "RemoveContainer" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.654714    4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687322    4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.687743    4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-api"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687764    4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-api"
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.687777    4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-httpd"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687785    4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-httpd"
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.687806    4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-log"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687814    4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-log"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687932    4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-api"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687949    4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-log"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.687960    4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" containerName="glance-httpd"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.690040    4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.690913    4806 scope.go:117] "RemoveContainer" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.691553    4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": container with ID starting with 4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc not found: ID does not exist" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.691579    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"} err="failed to get container status \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": rpc error: code = NotFound desc = could not find container \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": container with ID starting with 4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.691603    4806 scope.go:117] "RemoveContainer" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.694059    4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.694429    4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjlv5"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.694579    4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.696909    4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.697758    4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": container with ID starting with 205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1 not found: ID does not exist" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.697786    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"} err="failed to get container status \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": rpc error: code = NotFound desc = could not find container \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": container with ID starting with 205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1 not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.697802    4806 scope.go:117] "RemoveContainer" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"
Feb 17 15:38:52 crc kubenswrapper[4806]: E0217 15:38:52.698280    4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": container with ID starting with 7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b not found: ID does not exist" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.698318    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"} err="failed to get container status \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": rpc error: code = NotFound desc = could not find container \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": container with ID starting with 7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.698348    4806 scope.go:117] "RemoveContainer" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.699013    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"} err="failed to get container status \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": rpc error: code = NotFound desc = could not find container \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": container with ID starting with 4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.699039    4806 scope.go:117] "RemoveContainer" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.699542    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"} err="failed to get container status \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": rpc error: code = NotFound desc = could not find container \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": container with ID starting with 205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1 not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.699619    4806 scope.go:117] "RemoveContainer" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700030    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"} err="failed to get container status \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": rpc error: code = NotFound desc = could not find container \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": container with ID starting with 7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700058    4806 scope.go:117] "RemoveContainer" containerID="4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700315    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc"} err="failed to get container status \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": rpc error: code = NotFound desc = could not find container \"4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc\": container with ID starting with 4223448befe5f2e03bbd57a0e326f30c589f4850e6fbd0c70eae82ca82fc5bbc not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700340    4806 scope.go:117] "RemoveContainer" containerID="205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700777    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1"} err="failed to get container status \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": rpc error: code = NotFound desc = could not find container \"205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1\": container with ID starting with 205e6d91535910afe50903caee9740211177997bcfc913786f589eea4995f1d1 not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.700832    4806 scope.go:117] "RemoveContainer" containerID="7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.701245    4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b"} err="failed to get container status \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": rpc error: code = NotFound desc = could not find container \"7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b\": container with ID starting with 7db3e9356c60d77a68629e222e80917f396cec1f46d10e7c9821bd3c206a4d1b not found: ID does not exist"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.858578    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.858976    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859010    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859072    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859097    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859123    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859149    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859171    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4knz\" (UniqueName: \"kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859253    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859291    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859316    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859341    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859366    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.859391    4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962378    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962492    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962531    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962598    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962626    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962652    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962656    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962675    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962763    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4knz\" (UniqueName: \"kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962784    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962873    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962892    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962918    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962941    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962963    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.962983    4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.963063    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.963089    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.963154    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.963531    4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.963630    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.964343    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.964353    4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.964577    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.964784    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.970205    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.970617    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.990147    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:52 crc kubenswrapper[4806]: I0217 15:38:52.999113    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4knz\" (UniqueName: \"kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:53 crc kubenswrapper[4806]: I0217 15:38:53.005977    4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:53 crc kubenswrapper[4806]: I0217 15:38:53.022104    4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:38:53 crc kubenswrapper[4806]: I0217 15:38:53.175474    4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073c829c-e361-48fb-9e1f-d39bc43ff752" path="/var/lib/kubelet/pods/073c829c-e361-48fb-9e1f-d39bc43ff752/volumes"
Feb 17 15:38:53 crc kubenswrapper[4806]: I0217 15:38:53.505575    4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:38:53 crc kubenswrapper[4806]: I0217 15:38:53.603468    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerStarted","Data":"f807932bc457f47089e91a466cbe565b2db2390ceec88d0be1dcd9484eefc6fc"}
Feb 17 15:38:54 crc kubenswrapper[4806]: I0217 15:38:54.617185    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerStarted","Data":"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8"}
Feb 17 15:38:54 crc kubenswrapper[4806]: I0217 15:38:54.618187    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerStarted","Data":"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c"}
Feb 17 15:38:54 crc kubenswrapper[4806]: I0217 15:38:54.618213    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerStarted","Data":"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5"}
Feb 17 15:38:54 crc kubenswrapper[4806]: I0217 15:38:54.651135    4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.651114083 podStartE2EDuration="2.651114083s" podCreationTimestamp="2026-02-17 15:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:38:54.646869998 +0000 UTC m=+1096.177500419" watchObservedRunningTime="2026-02-17 15:38:54.651114083 +0000 UTC m=+1096.181744494"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.023084    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.023689    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.023706    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.053137    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.058183    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.074336    4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.692348    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.692415    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.692428    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.708142    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.708525    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:39:03 crc kubenswrapper[4806]: I0217 15:39:03.708589    4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:40:34 crc kubenswrapper[4806]: I0217 15:40:34.785001    4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:40:34 crc kubenswrapper[4806]: I0217 15:40:34.785587    4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:41:04 crc kubenswrapper[4806]: I0217 15:41:04.785140    4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:41:04 crc kubenswrapper[4806]: I0217 15:41:04.785777    4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.784809    4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.785480    4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.785560    4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx"
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.786396    4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.786497    4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88" gracePeriod=600
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.995716    4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88" exitCode=0
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.995780    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88"}
Feb 17 15:41:34 crc kubenswrapper[4806]: I0217 15:41:34.996180    4806 scope.go:117] "RemoveContainer" containerID="b16af821a4d64d3667cdbb96cac9edee38709ffe5e3eff30ff59770488d6700a"
Feb 17 15:41:36 crc kubenswrapper[4806]: I0217 15:41:36.008540    4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f"}
Feb 17 15:43:30 crc kubenswrapper[4806]: I0217 15:43:30.058208    4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-zfdgv"]
Feb 17 15:43:30 crc kubenswrapper[4806]: I0217 15:43:30.065618    4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-zfdgv"]
Feb 17 15:43:31 crc kubenswrapper[4806]: I0217 15:43:31.170741    4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8cdc68-a20d-4a5c-8561-a6245c4277cc" path="/var/lib/kubelet/pods/2d8cdc68-a20d-4a5c-8561-a6245c4277cc/volumes"
Feb 17 15:43:39 crc kubenswrapper[4806]: I0217 15:43:39.957914    4806 scope.go:117] "RemoveContainer" containerID="d54afb0a88d904663049b0088bbcac9155d64331fd532ed1658eb4d6a1f6c4ee"
Feb
17 15:43:40 crc kubenswrapper[4806]: I0217 15:43:40.000820 4806 scope.go:117] "RemoveContainer" containerID="cd549971250af6344d561c17c9e7ae17b2a2b64bf0366f555b6c8ff4905e205e" Feb 17 15:43:40 crc kubenswrapper[4806]: I0217 15:43:40.059268 4806 scope.go:117] "RemoveContainer" containerID="3eb6e59e42aae0771c9fb08320d365359714ff7f682113024e3a2e91a63ca1ef" Feb 17 15:43:40 crc kubenswrapper[4806]: I0217 15:43:40.088134 4806 scope.go:117] "RemoveContainer" containerID="9424863c48cd7cc36b24b3e125720d884aadfc81b5b4104dd24669e943dfd8e9" Feb 17 15:44:04 crc kubenswrapper[4806]: I0217 15:44:04.785100 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:44:04 crc kubenswrapper[4806]: I0217 15:44:04.785584 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:44:34 crc kubenswrapper[4806]: I0217 15:44:34.784262 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:44:34 crc kubenswrapper[4806]: I0217 15:44:34.784895 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Feb 17 15:44:40 crc kubenswrapper[4806]: I0217 15:44:40.174469 4806 scope.go:117] "RemoveContainer" containerID="696f7ec3bc821bbccf1af2780552bd1436404e92fc56a9bc576b8f2218ed1644" Feb 17 15:44:40 crc kubenswrapper[4806]: I0217 15:44:40.203431 4806 scope.go:117] "RemoveContainer" containerID="136fa9edc2ab43ff9711cbe9b2702924316c123a25f4c5ae19568b0d9f6dd1fe" Feb 17 15:44:40 crc kubenswrapper[4806]: I0217 15:44:40.230130 4806 scope.go:117] "RemoveContainer" containerID="eea8ded70f093ffd3ebc7eeefc7c38d7915ff70286d2355e120e11c4186b77a4" Feb 17 15:44:40 crc kubenswrapper[4806]: I0217 15:44:40.265984 4806 scope.go:117] "RemoveContainer" containerID="0d414e3f352b4d6c2dd6a65417dd18f087e5bbdcbaa5ef04b4f0e3ff4b613414" Feb 17 15:44:40 crc kubenswrapper[4806]: I0217 15:44:40.315197 4806 scope.go:117] "RemoveContainer" containerID="325c2d9c4132f5584db41d5eaf4bd4748f5ff01e7fd614a5368b2175270a46e1" Feb 17 15:44:46 crc kubenswrapper[4806]: I0217 15:44:46.042323 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-5pkxg"] Feb 17 15:44:46 crc kubenswrapper[4806]: I0217 15:44:46.049477 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-11a8-account-create-update-7chcm"] Feb 17 15:44:46 crc kubenswrapper[4806]: I0217 15:44:46.056649 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-5pkxg"] Feb 17 15:44:46 crc kubenswrapper[4806]: I0217 15:44:46.063665 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-11a8-account-create-update-7chcm"] Feb 17 15:44:47 crc kubenswrapper[4806]: I0217 15:44:47.177393 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa53f26-bda2-425f-a3be-e25f841cd4ed" path="/var/lib/kubelet/pods/baa53f26-bda2-425f-a3be-e25f841cd4ed/volumes" Feb 17 15:44:47 crc kubenswrapper[4806]: I0217 15:44:47.177988 4806 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="e03ca136-92c9-4afb-8044-3ddd06b0fd24" path="/var/lib/kubelet/pods/e03ca136-92c9-4afb-8044-3ddd06b0fd24/volumes" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.140592 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl"] Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.141988 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.144658 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.148587 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.152209 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl"] Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.259858 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.260113 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6lq\" (UniqueName: \"kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.260273 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.362144 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc6lq\" (UniqueName: \"kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.362252 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.362293 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.363244 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.368253 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.377580 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc6lq\" (UniqueName: \"kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq\") pod \"collect-profiles-29522385-g7nhl\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.463896 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:00 crc kubenswrapper[4806]: I0217 15:45:00.969506 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl"] Feb 17 15:45:01 crc kubenswrapper[4806]: I0217 15:45:01.941240 4806 generic.go:334] "Generic (PLEG): container finished" podID="09fc5eb2-9690-4380-90b4-0886c96c7ff8" containerID="02d7482acb227d974955696adb48a442e9160733d76ebb1d5534892bef6fe75f" exitCode=0 Feb 17 15:45:01 crc kubenswrapper[4806]: I0217 15:45:01.941308 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" event={"ID":"09fc5eb2-9690-4380-90b4-0886c96c7ff8","Type":"ContainerDied","Data":"02d7482acb227d974955696adb48a442e9160733d76ebb1d5534892bef6fe75f"} Feb 17 15:45:01 crc kubenswrapper[4806]: I0217 15:45:01.941638 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" event={"ID":"09fc5eb2-9690-4380-90b4-0886c96c7ff8","Type":"ContainerStarted","Data":"6ff0a2c0cb820d5fe0567e154f38e1695ad63f684a71ac1011998b6c89ac873a"} Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.205798 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.308171 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume\") pod \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.308266 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume\") pod \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.308306 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc6lq\" (UniqueName: \"kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq\") pod \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\" (UID: \"09fc5eb2-9690-4380-90b4-0886c96c7ff8\") " Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.308914 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume" (OuterVolumeSpecName: "config-volume") pod "09fc5eb2-9690-4380-90b4-0886c96c7ff8" (UID: "09fc5eb2-9690-4380-90b4-0886c96c7ff8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.318531 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq" (OuterVolumeSpecName: "kube-api-access-jc6lq") pod "09fc5eb2-9690-4380-90b4-0886c96c7ff8" (UID: "09fc5eb2-9690-4380-90b4-0886c96c7ff8"). 
InnerVolumeSpecName "kube-api-access-jc6lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.318594 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "09fc5eb2-9690-4380-90b4-0886c96c7ff8" (UID: "09fc5eb2-9690-4380-90b4-0886c96c7ff8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.410328 4806 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/09fc5eb2-9690-4380-90b4-0886c96c7ff8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.410367 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc6lq\" (UniqueName: \"kubernetes.io/projected/09fc5eb2-9690-4380-90b4-0886c96c7ff8-kube-api-access-jc6lq\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.410378 4806 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09fc5eb2-9690-4380-90b4-0886c96c7ff8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.957131 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" event={"ID":"09fc5eb2-9690-4380-90b4-0886c96c7ff8","Type":"ContainerDied","Data":"6ff0a2c0cb820d5fe0567e154f38e1695ad63f684a71ac1011998b6c89ac873a"} Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.957183 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ff0a2c0cb820d5fe0567e154f38e1695ad63f684a71ac1011998b6c89ac873a" Feb 17 15:45:03 crc kubenswrapper[4806]: I0217 15:45:03.957244 4806 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522385-g7nhl" Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.784921 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.785576 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.785654 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.786267 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.786320 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f" gracePeriod=600 Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.965479 4806 generic.go:334] 
"Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f" exitCode=0 Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.965540 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f"} Feb 17 15:45:04 crc kubenswrapper[4806]: I0217 15:45:04.966175 4806 scope.go:117] "RemoveContainer" containerID="23c0d5ed88fa20b6bb643ec42855fa80562a9cb2938d34d259221739e817be88" Feb 17 15:45:05 crc kubenswrapper[4806]: I0217 15:45:05.974642 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"} Feb 17 15:45:07 crc kubenswrapper[4806]: I0217 15:45:07.035929 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-nzrjr"] Feb 17 15:45:07 crc kubenswrapper[4806]: I0217 15:45:07.050997 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-nzrjr"] Feb 17 15:45:07 crc kubenswrapper[4806]: I0217 15:45:07.172097 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abc5787-64ef-4761-ab4e-38aec08f2c1b" path="/var/lib/kubelet/pods/2abc5787-64ef-4761-ab4e-38aec08f2c1b/volumes" Feb 17 15:45:14 crc kubenswrapper[4806]: I0217 15:45:14.040015 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-k8dwb"] Feb 17 15:45:14 crc kubenswrapper[4806]: I0217 15:45:14.051309 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-k8dwb"] Feb 17 15:45:15 crc kubenswrapper[4806]: I0217 
15:45:15.175341 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2e9089-1a6a-40e3-bb6a-e61f509ed5c1" path="/var/lib/kubelet/pods/da2e9089-1a6a-40e3-bb6a-e61f509ed5c1/volumes" Feb 17 15:45:40 crc kubenswrapper[4806]: I0217 15:45:40.398784 4806 scope.go:117] "RemoveContainer" containerID="0c62b706621968e83c330da71613e36b3d487bf89e9cc2f0d544e94e1e77839e" Feb 17 15:45:40 crc kubenswrapper[4806]: I0217 15:45:40.434126 4806 scope.go:117] "RemoveContainer" containerID="5d38decaa86b5cf9fbc7970243673df167ac3a039478b355704c6221445c084f" Feb 17 15:45:40 crc kubenswrapper[4806]: I0217 15:45:40.466027 4806 scope.go:117] "RemoveContainer" containerID="7403457696ed16220199b64b0bbf9863f3e1635cb48f2f35e467b3ea6d347b4b" Feb 17 15:45:40 crc kubenswrapper[4806]: I0217 15:45:40.510963 4806 scope.go:117] "RemoveContainer" containerID="535485cb7e686aea0c86925386253f72304d073254f9ec770a712183786aac51" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.796682 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbfnh"] Feb 17 15:45:55 crc kubenswrapper[4806]: E0217 15:45:55.797796 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09fc5eb2-9690-4380-90b4-0886c96c7ff8" containerName="collect-profiles" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.797822 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fc5eb2-9690-4380-90b4-0886c96c7ff8" containerName="collect-profiles" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.798136 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="09fc5eb2-9690-4380-90b4-0886c96c7ff8" containerName="collect-profiles" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.800275 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.822641 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbfnh"] Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.974689 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fts\" (UniqueName: \"kubernetes.io/projected/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-kube-api-access-j5fts\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.974800 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-catalog-content\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:55 crc kubenswrapper[4806]: I0217 15:45:55.974842 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-utilities\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.076582 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-utilities\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.076646 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j5fts\" (UniqueName: \"kubernetes.io/projected/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-kube-api-access-j5fts\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.076712 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-catalog-content\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.077118 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-utilities\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.077143 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-catalog-content\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.096963 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fts\" (UniqueName: \"kubernetes.io/projected/5d5616f2-e471-4b7b-9434-e6e438a0cb5d-kube-api-access-j5fts\") pod \"redhat-operators-qbfnh\" (UID: \"5d5616f2-e471-4b7b-9434-e6e438a0cb5d\") " pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.123657 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:45:56 crc kubenswrapper[4806]: I0217 15:45:56.545824 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbfnh"] Feb 17 15:45:57 crc kubenswrapper[4806]: I0217 15:45:57.439354 4806 generic.go:334] "Generic (PLEG): container finished" podID="5d5616f2-e471-4b7b-9434-e6e438a0cb5d" containerID="fffb00d2bc9a6d83c7f90025893f518542e61dda995c1d197e1b4f319e72e081" exitCode=0 Feb 17 15:45:57 crc kubenswrapper[4806]: I0217 15:45:57.439461 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbfnh" event={"ID":"5d5616f2-e471-4b7b-9434-e6e438a0cb5d","Type":"ContainerDied","Data":"fffb00d2bc9a6d83c7f90025893f518542e61dda995c1d197e1b4f319e72e081"} Feb 17 15:45:57 crc kubenswrapper[4806]: I0217 15:45:57.439730 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbfnh" event={"ID":"5d5616f2-e471-4b7b-9434-e6e438a0cb5d","Type":"ContainerStarted","Data":"d033e3fe6741cff045ef886cc7401fb4a209bbc4f39b08f45026d531285e0a11"} Feb 17 15:45:57 crc kubenswrapper[4806]: I0217 15:45:57.441689 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:46:06 crc kubenswrapper[4806]: I0217 15:46:06.505128 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbfnh" event={"ID":"5d5616f2-e471-4b7b-9434-e6e438a0cb5d","Type":"ContainerStarted","Data":"71ab4ac0fba33a8853d5ddd8db2146ec42e1a9312fb5c629f50fc7c74398d225"} Feb 17 15:46:07 crc kubenswrapper[4806]: I0217 15:46:07.514236 4806 generic.go:334] "Generic (PLEG): container finished" podID="5d5616f2-e471-4b7b-9434-e6e438a0cb5d" containerID="71ab4ac0fba33a8853d5ddd8db2146ec42e1a9312fb5c629f50fc7c74398d225" exitCode=0 Feb 17 15:46:07 crc kubenswrapper[4806]: I0217 15:46:07.514288 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-qbfnh" event={"ID":"5d5616f2-e471-4b7b-9434-e6e438a0cb5d","Type":"ContainerDied","Data":"71ab4ac0fba33a8853d5ddd8db2146ec42e1a9312fb5c629f50fc7c74398d225"} Feb 17 15:46:08 crc kubenswrapper[4806]: I0217 15:46:08.523843 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbfnh" event={"ID":"5d5616f2-e471-4b7b-9434-e6e438a0cb5d","Type":"ContainerStarted","Data":"00107f8e7674411b404dff6a3f631a9c5c78fe24d085e23fe971bcfbe7b2addf"} Feb 17 15:46:08 crc kubenswrapper[4806]: I0217 15:46:08.548591 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbfnh" podStartSLOduration=3.034763805 podStartE2EDuration="13.548566808s" podCreationTimestamp="2026-02-17 15:45:55 +0000 UTC" firstStartedPulling="2026-02-17 15:45:57.441388967 +0000 UTC m=+1518.972019378" lastFinishedPulling="2026-02-17 15:46:07.95519196 +0000 UTC m=+1529.485822381" observedRunningTime="2026-02-17 15:46:08.541097773 +0000 UTC m=+1530.071728204" watchObservedRunningTime="2026-02-17 15:46:08.548566808 +0000 UTC m=+1530.079197259" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.052730 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.054805 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.075155 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.135197 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.135355 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77lk2\" (UniqueName: \"kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.135396 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.240073 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77lk2\" (UniqueName: \"kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.240221 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.240304 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.241652 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.241771 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.269889 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77lk2\" (UniqueName: \"kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2\") pod \"certified-operators-hdtjs\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.374863 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:13 crc kubenswrapper[4806]: I0217 15:46:13.846530 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:14 crc kubenswrapper[4806]: I0217 15:46:14.595767 4806 generic.go:334] "Generic (PLEG): container finished" podID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerID="4a531e2efe9ccc5c53c6be288fc4ba95479b72d50a772c81c720862ace5cee70" exitCode=0 Feb 17 15:46:14 crc kubenswrapper[4806]: I0217 15:46:14.595824 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerDied","Data":"4a531e2efe9ccc5c53c6be288fc4ba95479b72d50a772c81c720862ace5cee70"} Feb 17 15:46:14 crc kubenswrapper[4806]: I0217 15:46:14.595860 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerStarted","Data":"e8f7a76f33bff25dcaf517be25689ec2b8add7545602d382a99027ebbb412045"} Feb 17 15:46:16 crc kubenswrapper[4806]: I0217 15:46:16.124017 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:46:16 crc kubenswrapper[4806]: I0217 15:46:16.124276 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:46:16 crc kubenswrapper[4806]: I0217 15:46:16.164756 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:46:16 crc kubenswrapper[4806]: I0217 15:46:16.617246 4806 generic.go:334] "Generic (PLEG): container finished" podID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerID="a1e673933a0fd4645a200f17c4b65685d0d3ef15e9a2ca832992ef4995aaf7f6" exitCode=0 Feb 17 15:46:16 crc 
kubenswrapper[4806]: I0217 15:46:16.617371 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerDied","Data":"a1e673933a0fd4645a200f17c4b65685d0d3ef15e9a2ca832992ef4995aaf7f6"} Feb 17 15:46:16 crc kubenswrapper[4806]: I0217 15:46:16.691490 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbfnh" Feb 17 15:46:17 crc kubenswrapper[4806]: I0217 15:46:17.668378 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbfnh"] Feb 17 15:46:18 crc kubenswrapper[4806]: I0217 15:46:18.036658 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:46:18 crc kubenswrapper[4806]: I0217 15:46:18.036886 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpggz" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="registry-server" containerID="cri-o://87d608d9be99788b64ffb2987212bd05d3c35d5629cded57d7c27edcdac8c318" gracePeriod=2 Feb 17 15:46:19 crc kubenswrapper[4806]: I0217 15:46:19.639976 4806 generic.go:334] "Generic (PLEG): container finished" podID="0d056e41-b3c8-477d-9639-2134afcf7535" containerID="87d608d9be99788b64ffb2987212bd05d3c35d5629cded57d7c27edcdac8c318" exitCode=0 Feb 17 15:46:19 crc kubenswrapper[4806]: I0217 15:46:19.640049 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerDied","Data":"87d608d9be99788b64ffb2987212bd05d3c35d5629cded57d7c27edcdac8c318"} Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.648048 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" 
event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerStarted","Data":"7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df"} Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.713460 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.876722 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msk47\" (UniqueName: \"kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47\") pod \"0d056e41-b3c8-477d-9639-2134afcf7535\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.876813 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities\") pod \"0d056e41-b3c8-477d-9639-2134afcf7535\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.876954 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content\") pod \"0d056e41-b3c8-477d-9639-2134afcf7535\" (UID: \"0d056e41-b3c8-477d-9639-2134afcf7535\") " Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.877820 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities" (OuterVolumeSpecName: "utilities") pod "0d056e41-b3c8-477d-9639-2134afcf7535" (UID: "0d056e41-b3c8-477d-9639-2134afcf7535"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.891441 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47" (OuterVolumeSpecName: "kube-api-access-msk47") pod "0d056e41-b3c8-477d-9639-2134afcf7535" (UID: "0d056e41-b3c8-477d-9639-2134afcf7535"). InnerVolumeSpecName "kube-api-access-msk47". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.978871 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msk47\" (UniqueName: \"kubernetes.io/projected/0d056e41-b3c8-477d-9639-2134afcf7535-kube-api-access-msk47\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:20 crc kubenswrapper[4806]: I0217 15:46:20.978899 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.022651 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d056e41-b3c8-477d-9639-2134afcf7535" (UID: "0d056e41-b3c8-477d-9639-2134afcf7535"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.080041 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d056e41-b3c8-477d-9639-2134afcf7535-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.656736 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpggz" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.657203 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpggz" event={"ID":"0d056e41-b3c8-477d-9639-2134afcf7535","Type":"ContainerDied","Data":"fdf56cb9309e6edb4d8ce7f889c2eecc3ab7cd14f46fc16fc28828d9bf32b6ce"} Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.657234 4806 scope.go:117] "RemoveContainer" containerID="87d608d9be99788b64ffb2987212bd05d3c35d5629cded57d7c27edcdac8c318" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.674428 4806 scope.go:117] "RemoveContainer" containerID="ac3e50098ae97e696ab0358be018560d988af169a740f9432ffa28494c130b44" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.680815 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hdtjs" podStartSLOduration=2.828931863 podStartE2EDuration="8.680799667s" podCreationTimestamp="2026-02-17 15:46:13 +0000 UTC" firstStartedPulling="2026-02-17 15:46:14.600608294 +0000 UTC m=+1536.131238715" lastFinishedPulling="2026-02-17 15:46:20.452476108 +0000 UTC m=+1541.983106519" observedRunningTime="2026-02-17 15:46:21.676666464 +0000 UTC m=+1543.207296875" watchObservedRunningTime="2026-02-17 15:46:21.680799667 +0000 UTC m=+1543.211430078" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.708569 4806 scope.go:117] "RemoveContainer" containerID="988ab8e7d27c971268902ae4f61248a9c9effd1da13574fcbfb95cb7f2beb990" Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.711513 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:46:21 crc kubenswrapper[4806]: I0217 15:46:21.715529 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpggz"] Feb 17 15:46:23 crc kubenswrapper[4806]: I0217 15:46:23.174681 4806 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" path="/var/lib/kubelet/pods/0d056e41-b3c8-477d-9639-2134afcf7535/volumes" Feb 17 15:46:23 crc kubenswrapper[4806]: I0217 15:46:23.375749 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:23 crc kubenswrapper[4806]: I0217 15:46:23.375816 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:23 crc kubenswrapper[4806]: I0217 15:46:23.428128 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:33 crc kubenswrapper[4806]: I0217 15:46:33.438736 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:33 crc kubenswrapper[4806]: I0217 15:46:33.494001 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:33 crc kubenswrapper[4806]: I0217 15:46:33.761824 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hdtjs" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="registry-server" containerID="cri-o://7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df" gracePeriod=2 Feb 17 15:46:33 crc kubenswrapper[4806]: E0217 15:46:33.983746 4806 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f1679e7_7525_46b1_8f6d_8ff06cfcb9b3.slice/crio-7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df.scope\": RecentStats: unable to find data in memory cache]" Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.769854 4806 generic.go:334] "Generic (PLEG): container finished" 
podID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerID="7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df" exitCode=0 Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.769899 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerDied","Data":"7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df"} Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.770238 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hdtjs" event={"ID":"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3","Type":"ContainerDied","Data":"e8f7a76f33bff25dcaf517be25689ec2b8add7545602d382a99027ebbb412045"} Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.770255 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8f7a76f33bff25dcaf517be25689ec2b8add7545602d382a99027ebbb412045" Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.796219 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.912686 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities\") pod \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.912834 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77lk2\" (UniqueName: \"kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2\") pod \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.913069 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content\") pod \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\" (UID: \"0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3\") " Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.913933 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities" (OuterVolumeSpecName: "utilities") pod "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" (UID: "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.923513 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2" (OuterVolumeSpecName: "kube-api-access-77lk2") pod "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" (UID: "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3"). InnerVolumeSpecName "kube-api-access-77lk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:46:34 crc kubenswrapper[4806]: I0217 15:46:34.964735 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" (UID: "0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.014699 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.014749 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.014763 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77lk2\" (UniqueName: \"kubernetes.io/projected/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3-kube-api-access-77lk2\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.780032 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hdtjs" Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.816317 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:35 crc kubenswrapper[4806]: I0217 15:46:35.826906 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hdtjs"] Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.289946 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"] Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290272 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290292 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290310 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="extract-utilities" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290319 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="extract-utilities" Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290333 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="extract-utilities" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290342 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="extract-utilities" Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290357 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="extract-content" 
Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290365 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="extract-content" Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290388 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290395 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: E0217 15:46:36.290434 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="extract-content" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290444 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="extract-content" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290589 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d056e41-b3c8-477d-9639-2134afcf7535" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.290613 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" containerName="registry-server" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.291747 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.304709 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"] Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.437054 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtrh\" (UniqueName: \"kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.437111 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.437158 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.538916 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtrh\" (UniqueName: \"kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.539042 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.539590 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.539676 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.540051 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.559882 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtrh\" (UniqueName: \"kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh\") pod \"redhat-marketplace-7fw98\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") " pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:36 crc kubenswrapper[4806]: I0217 15:46:36.611817 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fw98" Feb 17 15:46:37 crc kubenswrapper[4806]: I0217 15:46:37.062259 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"] Feb 17 15:46:37 crc kubenswrapper[4806]: W0217 15:46:37.076536 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17393b6_783f_4a68_88ab_36be151a1181.slice/crio-aa904a8a050ef761f5e4ae3ee1ff0025260fdc8cbc21686e5a13ae34c5349027 WatchSource:0}: Error finding container aa904a8a050ef761f5e4ae3ee1ff0025260fdc8cbc21686e5a13ae34c5349027: Status 404 returned error can't find the container with id aa904a8a050ef761f5e4ae3ee1ff0025260fdc8cbc21686e5a13ae34c5349027 Feb 17 15:46:37 crc kubenswrapper[4806]: I0217 15:46:37.169125 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3" path="/var/lib/kubelet/pods/0f1679e7-7525-46b1-8f6d-8ff06cfcb9b3/volumes" Feb 17 15:46:37 crc kubenswrapper[4806]: I0217 15:46:37.801748 4806 generic.go:334] "Generic (PLEG): container finished" podID="f17393b6-783f-4a68-88ab-36be151a1181" containerID="56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd" exitCode=0 Feb 17 15:46:37 crc kubenswrapper[4806]: I0217 15:46:37.801834 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerDied","Data":"56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd"} Feb 17 15:46:37 crc kubenswrapper[4806]: I0217 15:46:37.801920 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerStarted","Data":"aa904a8a050ef761f5e4ae3ee1ff0025260fdc8cbc21686e5a13ae34c5349027"} Feb 17 15:46:38 crc kubenswrapper[4806]: I0217 
15:46:38.817030 4806 generic.go:334] "Generic (PLEG): container finished" podID="f17393b6-783f-4a68-88ab-36be151a1181" containerID="0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5" exitCode=0 Feb 17 15:46:38 crc kubenswrapper[4806]: I0217 15:46:38.817082 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerDied","Data":"0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5"} Feb 17 15:46:39 crc kubenswrapper[4806]: I0217 15:46:39.829770 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerStarted","Data":"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"} Feb 17 15:46:39 crc kubenswrapper[4806]: I0217 15:46:39.851796 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7fw98" podStartSLOduration=2.459631625 podStartE2EDuration="3.851774653s" podCreationTimestamp="2026-02-17 15:46:36 +0000 UTC" firstStartedPulling="2026-02-17 15:46:37.80433195 +0000 UTC m=+1559.334962391" lastFinishedPulling="2026-02-17 15:46:39.196475008 +0000 UTC m=+1560.727105419" observedRunningTime="2026-02-17 15:46:39.850142282 +0000 UTC m=+1561.380772713" watchObservedRunningTime="2026-02-17 15:46:39.851774653 +0000 UTC m=+1561.382405074" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.120085 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.120462 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-log" containerID="cri-o://1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" 
gracePeriod=30 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.120590 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-httpd" containerID="cri-o://7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" gracePeriod=30 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.120855 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-api" containerID="cri-o://0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" gracePeriod=30 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.843449 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849294 4806 generic.go:334] "Generic (PLEG): container finished" podID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" exitCode=0 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849336 4806 generic.go:334] "Generic (PLEG): container finished" podID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" exitCode=0 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849345 4806 generic.go:334] "Generic (PLEG): container finished" podID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" exitCode=143 Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849371 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerDied","Data":"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8"} Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849428 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerDied","Data":"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c"} Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849444 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerDied","Data":"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5"} Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849455 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"07fb9dbb-ffae-4648-b2a3-25dd418b76a9","Type":"ContainerDied","Data":"f807932bc457f47089e91a466cbe565b2db2390ceec88d0be1dcd9484eefc6fc"} Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849459 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.849473 4806 scope.go:117] "RemoveContainer" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.893044 4806 scope.go:117] "RemoveContainer" containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.912719 4806 scope.go:117] "RemoveContainer" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.933696 4806 scope.go:117] "RemoveContainer" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" Feb 17 15:46:41 crc kubenswrapper[4806]: E0217 15:46:41.934272 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": container with ID starting with 0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8 not found: ID does not exist" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.934318 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8"} err="failed to get container status \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": rpc error: code = NotFound desc = could not find container \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": container with ID starting with 0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.934353 4806 scope.go:117] "RemoveContainer" 
containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" Feb 17 15:46:41 crc kubenswrapper[4806]: E0217 15:46:41.934675 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": container with ID starting with 7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c not found: ID does not exist" containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.934703 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c"} err="failed to get container status \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": rpc error: code = NotFound desc = could not find container \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": container with ID starting with 7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.934718 4806 scope.go:117] "RemoveContainer" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" Feb 17 15:46:41 crc kubenswrapper[4806]: E0217 15:46:41.934974 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": container with ID starting with 1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5 not found: ID does not exist" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935001 4806 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5"} err="failed to get container status \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": rpc error: code = NotFound desc = could not find container \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": container with ID starting with 1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935019 4806 scope.go:117] "RemoveContainer" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935250 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8"} err="failed to get container status \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": rpc error: code = NotFound desc = could not find container \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": container with ID starting with 0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935270 4806 scope.go:117] "RemoveContainer" containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935519 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c"} err="failed to get container status \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": rpc error: code = NotFound desc = could not find container \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": container with ID starting with 7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c not found: ID does not 
exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935540 4806 scope.go:117] "RemoveContainer" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935825 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5"} err="failed to get container status \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": rpc error: code = NotFound desc = could not find container \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": container with ID starting with 1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.935846 4806 scope.go:117] "RemoveContainer" containerID="0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.936076 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8"} err="failed to get container status \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": rpc error: code = NotFound desc = could not find container \"0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8\": container with ID starting with 0d134b6204d309ba0cb00b5725e60410d3cc8453613e6fdcf153312b7fb3bbe8 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.936093 4806 scope.go:117] "RemoveContainer" containerID="7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.936307 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c"} err="failed to get container status 
\"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": rpc error: code = NotFound desc = could not find container \"7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c\": container with ID starting with 7803c6be072fd83940a20f5e3f03f609200d0512ed65aac3c2b3a95d7f36db7c not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.936321 4806 scope.go:117] "RemoveContainer" containerID="1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.937724 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5"} err="failed to get container status \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": rpc error: code = NotFound desc = could not find container \"1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5\": container with ID starting with 1dfa000b90f75c442108f04a2266add7b65bf2a3099cf341ba6b44fd836d19f5 not found: ID does not exist" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959115 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959168 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959184 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959204 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959224 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959267 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959280 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev" (OuterVolumeSpecName: "dev") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959301 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959319 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959322 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4knz\" (UniqueName: \"kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959340 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys" (OuterVolumeSpecName: "sys") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959354 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959359 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959380 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959431 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959436 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run" (OuterVolumeSpecName: "run") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959448 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959471 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959463 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959507 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\" (UID: \"07fb9dbb-ffae-4648-b2a3-25dd418b76a9\") " Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959813 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959829 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc 
kubenswrapper[4806]: I0217 15:46:41.959840 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959850 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959863 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959873 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.959924 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs" (OuterVolumeSpecName: "logs") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.960016 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.960319 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.964236 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.964506 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz" (OuterVolumeSpecName: "kube-api-access-h4knz") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "kube-api-access-h4knz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.964539 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts" (OuterVolumeSpecName: "scripts") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:46:41 crc kubenswrapper[4806]: I0217 15:46:41.965837 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.020228 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data" (OuterVolumeSpecName: "config-data") pod "07fb9dbb-ffae-4648-b2a3-25dd418b76a9" (UID: "07fb9dbb-ffae-4648-b2a3-25dd418b76a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061637 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4knz\" (UniqueName: \"kubernetes.io/projected/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-kube-api-access-h4knz\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061681 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061717 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061726 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 
15:46:42.061739 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061749 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061758 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.061767 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07fb9dbb-ffae-4648-b2a3-25dd418b76a9-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.076964 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.085613 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.162714 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.162750 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.191630 4806 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.202764 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566235 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:46:42 crc kubenswrapper[4806]: E0217 15:46:42.566492 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-api" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566503 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-api" Feb 17 15:46:42 crc kubenswrapper[4806]: E0217 15:46:42.566513 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-httpd" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566519 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-httpd" Feb 17 15:46:42 crc kubenswrapper[4806]: E0217 15:46:42.566533 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-log" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566539 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-log" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566656 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-api" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.566670 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-httpd" Feb 17 15:46:42 crc 
kubenswrapper[4806]: I0217 15:46:42.566682 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" containerName="glance-log" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.567299 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.569097 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.570907 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.571195 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-sjlv5" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.581478 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670454 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670501 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670528 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670551 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-854wz\" (UniqueName: \"kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670568 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670605 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670621 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670636 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670664 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670679 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670705 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670722 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670738 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.670776 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772265 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772341 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772385 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772556 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772484 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-854wz\" (UniqueName: \"kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772675 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.772907 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.773078 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.773202 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.773769 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774062 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774310 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774388 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774545 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774467 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.774478 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775211 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775326 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775500 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775601 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775603 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775721 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775772 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775787 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.775779 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.787045 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.788000 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.799931 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-854wz\" (UniqueName: \"kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.802925 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.811185 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:42 crc kubenswrapper[4806]: I0217 15:46:42.883923 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.170702 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fb9dbb-ffae-4648-b2a3-25dd418b76a9" path="/var/lib/kubelet/pods/07fb9dbb-ffae-4648-b2a3-25dd418b76a9/volumes"
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.322695 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Feb 17 15:46:43 crc kubenswrapper[4806]: W0217 15:46:43.325111 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cf87b2b_09d2_4b50_937b_aa437f44e3e0.slice/crio-5d4b1e06e2754cf7f2c00dcc9fa239ceafacf044016fab252e5517adc802d553 WatchSource:0}: Error finding container 5d4b1e06e2754cf7f2c00dcc9fa239ceafacf044016fab252e5517adc802d553: Status 404 returned error can't find the container with id 5d4b1e06e2754cf7f2c00dcc9fa239ceafacf044016fab252e5517adc802d553
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.870161 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerStarted","Data":"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26"}
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.870821 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerStarted","Data":"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8"}
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.870841 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerStarted","Data":"5d4b1e06e2754cf7f2c00dcc9fa239ceafacf044016fab252e5517adc802d553"}
Feb 17 15:46:43 crc kubenswrapper[4806]: I0217 15:46:43.905604 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.905583668 podStartE2EDuration="1.905583668s" podCreationTimestamp="2026-02-17 15:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:46:43.900285586 +0000 UTC m=+1565.430916047" watchObservedRunningTime="2026-02-17 15:46:43.905583668 +0000 UTC m=+1565.436214089"
Feb 17 15:46:46 crc kubenswrapper[4806]: I0217 15:46:46.612122 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:46 crc kubenswrapper[4806]: I0217 15:46:46.612772 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:46 crc kubenswrapper[4806]: I0217 15:46:46.688814 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:46 crc kubenswrapper[4806]: I0217 15:46:46.966315 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:47 crc kubenswrapper[4806]: I0217 15:46:47.015837 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"]
Feb 17 15:46:48 crc kubenswrapper[4806]: I0217 15:46:48.917095 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7fw98" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="registry-server" containerID="cri-o://665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce" gracePeriod=2
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.362110 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.500728 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhtrh\" (UniqueName: \"kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh\") pod \"f17393b6-783f-4a68-88ab-36be151a1181\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") "
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.501508 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content\") pod \"f17393b6-783f-4a68-88ab-36be151a1181\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") "
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.501765 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities\") pod \"f17393b6-783f-4a68-88ab-36be151a1181\" (UID: \"f17393b6-783f-4a68-88ab-36be151a1181\") "
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.502493 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities" (OuterVolumeSpecName: "utilities") pod "f17393b6-783f-4a68-88ab-36be151a1181" (UID: "f17393b6-783f-4a68-88ab-36be151a1181"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.502700 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.509039 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh" (OuterVolumeSpecName: "kube-api-access-fhtrh") pod "f17393b6-783f-4a68-88ab-36be151a1181" (UID: "f17393b6-783f-4a68-88ab-36be151a1181"). InnerVolumeSpecName "kube-api-access-fhtrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.546678 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f17393b6-783f-4a68-88ab-36be151a1181" (UID: "f17393b6-783f-4a68-88ab-36be151a1181"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.604169 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhtrh\" (UniqueName: \"kubernetes.io/projected/f17393b6-783f-4a68-88ab-36be151a1181-kube-api-access-fhtrh\") on node \"crc\" DevicePath \"\""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.604220 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f17393b6-783f-4a68-88ab-36be151a1181-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.930036 4806 generic.go:334] "Generic (PLEG): container finished" podID="f17393b6-783f-4a68-88ab-36be151a1181" containerID="665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce" exitCode=0
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.930126 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerDied","Data":"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"}
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.930158 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fw98"
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.930212 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fw98" event={"ID":"f17393b6-783f-4a68-88ab-36be151a1181","Type":"ContainerDied","Data":"aa904a8a050ef761f5e4ae3ee1ff0025260fdc8cbc21686e5a13ae34c5349027"}
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.930250 4806 scope.go:117] "RemoveContainer" containerID="665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"
Feb 17 15:46:49 crc kubenswrapper[4806]: I0217 15:46:49.961938 4806 scope.go:117] "RemoveContainer" containerID="0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.000811 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"]
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.011895 4806 scope.go:117] "RemoveContainer" containerID="56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.021702 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fw98"]
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.049386 4806 scope.go:117] "RemoveContainer" containerID="665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"
Feb 17 15:46:50 crc kubenswrapper[4806]: E0217 15:46:50.049709 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce\": container with ID starting with 665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce not found: ID does not exist" containerID="665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.049746 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce"} err="failed to get container status \"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce\": rpc error: code = NotFound desc = could not find container \"665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce\": container with ID starting with 665633f3cd1cb898c049ab6d9d00e54c183728ae92199faf8e03cfc9d848c8ce not found: ID does not exist"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.049771 4806 scope.go:117] "RemoveContainer" containerID="0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5"
Feb 17 15:46:50 crc kubenswrapper[4806]: E0217 15:46:50.050301 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5\": container with ID starting with 0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5 not found: ID does not exist" containerID="0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.050475 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5"} err="failed to get container status \"0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5\": rpc error: code = NotFound desc = could not find container \"0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5\": container with ID starting with 0b5c1708cad64e5a04f1b474e46dfbbf8f6d77c2834f3db8a404d943e5efe8c5 not found: ID does not exist"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.050628 4806 scope.go:117] "RemoveContainer" containerID="56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd"
Feb 17 15:46:50 crc kubenswrapper[4806]: E0217 15:46:50.051067 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd\": container with ID starting with 56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd not found: ID does not exist" containerID="56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd"
Feb 17 15:46:50 crc kubenswrapper[4806]: I0217 15:46:50.051114 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd"} err="failed to get container status \"56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd\": rpc error: code = NotFound desc = could not find container \"56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd\": container with ID starting with 56ef6ab11b355cbee34c27a9a3cdaa441d410483207de9a5709613a7fd0b76dd not found: ID does not exist"
Feb 17 15:46:51 crc kubenswrapper[4806]: I0217 15:46:51.173747 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17393b6-783f-4a68-88ab-36be151a1181" path="/var/lib/kubelet/pods/f17393b6-783f-4a68-88ab-36be151a1181/volumes"
Feb 17 15:46:52 crc kubenswrapper[4806]: I0217 15:46:52.886797 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:52 crc kubenswrapper[4806]: I0217 15:46:52.887196 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:52 crc kubenswrapper[4806]: I0217 15:46:52.931713 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:52 crc kubenswrapper[4806]: I0217 15:46:52.962162 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:52 crc kubenswrapper[4806]: I0217 15:46:52.963941 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:53 crc kubenswrapper[4806]: I0217 15:46:53.967512 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:54 crc kubenswrapper[4806]: I0217 15:46:54.974846 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 15:46:55 crc kubenswrapper[4806]: I0217 15:46:55.101273 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:55 crc kubenswrapper[4806]: I0217 15:46:55.966802 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.776661 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Feb 17 15:46:58 crc kubenswrapper[4806]: E0217 15:46:58.777352 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="registry-server"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.777371 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="registry-server"
Feb 17 15:46:58 crc kubenswrapper[4806]: E0217 15:46:58.777386 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="extract-content"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.777397 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="extract-content"
Feb 17 15:46:58 crc kubenswrapper[4806]: E0217 15:46:58.777441 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="extract-utilities"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.777452 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="extract-utilities"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.777614 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17393b6-783f-4a68-88ab-36be151a1181" containerName="registry-server"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.778527 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.790780 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.792091 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.821682 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.823850 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843514 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843566 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843599 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g529g\" (UniqueName: \"kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843663 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843684 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843801 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843833 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843852 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843878 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843939 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.843965 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.844007 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.844030 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.844070 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945608 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945657 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945675 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945705 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945760 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bzdq\" (UniqueName: \"kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.945984 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946028 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2"
Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946060 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run\") pod 
\"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946089 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946106 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946135 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946179 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g529g\" (UniqueName: \"kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946224 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " 
pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946291 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946329 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946354 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946354 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946395 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 
15:46:58.946443 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946445 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946469 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946490 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946496 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946519 4806 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946688 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946727 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946733 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946758 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946780 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946884 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946901 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946917 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946935 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946949 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev\") pod \"glance-default-single-1\" (UID: 
\"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.946964 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.947019 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.947031 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.947064 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.947186 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 
15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.956590 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.963641 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.969096 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.971995 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g529g\" (UniqueName: \"kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:58 crc kubenswrapper[4806]: I0217 15:46:58.980312 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.048881 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049049 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049136 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049158 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049197 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049242 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev\") pod 
\"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049310 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049480 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049547 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049607 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049642 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bzdq\" (UniqueName: \"kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " 
pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049695 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049727 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049766 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049819 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.049940 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.050094 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.050648 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.050720 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.050771 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.054310 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.056062 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.056169 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.056221 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.056334 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.056436 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.057344 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") 
" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.079206 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.084234 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bzdq\" (UniqueName: \"kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.089481 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-single-2\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.102558 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.112235 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.594312 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 15:46:59 crc kubenswrapper[4806]: W0217 15:46:59.596548 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb3b2ac_4b70_479b_b2e2_4cc113ef2772.slice/crio-b03575efa54e07fa623603532079d1fe730b719f20b78f8c62685255d1d96bba WatchSource:0}: Error finding container b03575efa54e07fa623603532079d1fe730b719f20b78f8c62685255d1d96bba: Status 404 returned error can't find the container with id b03575efa54e07fa623603532079d1fe730b719f20b78f8c62685255d1d96bba Feb 17 15:46:59 crc kubenswrapper[4806]: I0217 15:46:59.635633 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:46:59 crc kubenswrapper[4806]: W0217 15:46:59.668038 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7097a9e4_3039_4225_a6bf_b7996aaf828d.slice/crio-3e3956852ad3bd7a42439c27f8a985a90d6341766de2e9505bb2f2d21f33d80f WatchSource:0}: Error finding container 3e3956852ad3bd7a42439c27f8a985a90d6341766de2e9505bb2f2d21f33d80f: Status 404 returned error can't find the container with id 3e3956852ad3bd7a42439c27f8a985a90d6341766de2e9505bb2f2d21f33d80f Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.036203 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerStarted","Data":"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.036827 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" 
event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerStarted","Data":"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.036844 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerStarted","Data":"b03575efa54e07fa623603532079d1fe730b719f20b78f8c62685255d1d96bba"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.039805 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerStarted","Data":"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.039850 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerStarted","Data":"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.039878 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerStarted","Data":"3e3956852ad3bd7a42439c27f8a985a90d6341766de2e9505bb2f2d21f33d80f"} Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.072034 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.072013608 podStartE2EDuration="3.072013608s" podCreationTimestamp="2026-02-17 15:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:00.067545197 +0000 UTC m=+1581.598175628" watchObservedRunningTime="2026-02-17 15:47:00.072013608 +0000 UTC 
m=+1581.602644029" Feb 17 15:47:00 crc kubenswrapper[4806]: I0217 15:47:00.112597 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.112572065 podStartE2EDuration="3.112572065s" podCreationTimestamp="2026-02-17 15:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:00.092684851 +0000 UTC m=+1581.623315302" watchObservedRunningTime="2026-02-17 15:47:00.112572065 +0000 UTC m=+1581.643202496" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.103244 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.103947 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.112691 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.112769 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.142551 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.142927 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.143558 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.190287 4806 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:09 crc kubenswrapper[4806]: I0217 15:47:09.190512 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:10 crc kubenswrapper[4806]: I0217 15:47:10.133713 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:10 crc kubenswrapper[4806]: I0217 15:47:10.134050 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:10 crc kubenswrapper[4806]: I0217 15:47:10.134066 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:11 crc kubenswrapper[4806]: I0217 15:47:11.020742 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:12 crc kubenswrapper[4806]: I0217 15:47:12.054354 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:12 crc kubenswrapper[4806]: I0217 15:47:12.152595 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:47:12 crc kubenswrapper[4806]: I0217 15:47:12.152628 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:47:12 crc kubenswrapper[4806]: I0217 15:47:12.229359 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:12 crc kubenswrapper[4806]: I0217 15:47:12.286780 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:13 crc kubenswrapper[4806]: I0217 15:47:13.116285 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 15:47:13 crc kubenswrapper[4806]: I0217 15:47:13.125989 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:14 crc kubenswrapper[4806]: I0217 15:47:14.167548 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-log" containerID="cri-o://8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102" gracePeriod=30 Feb 17 15:47:14 crc kubenswrapper[4806]: I0217 15:47:14.167672 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-httpd" containerID="cri-o://01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc" gracePeriod=30 Feb 17 15:47:14 crc kubenswrapper[4806]: I0217 15:47:14.168012 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-httpd" containerID="cri-o://099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef" gracePeriod=30 Feb 17 15:47:14 crc kubenswrapper[4806]: I0217 15:47:14.168230 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-log" containerID="cri-o://84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77" gracePeriod=30 Feb 17 15:47:15 crc kubenswrapper[4806]: I0217 15:47:15.177114 4806 generic.go:334] "Generic (PLEG): container finished" podID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerID="8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102" exitCode=143 Feb 17 15:47:15 crc kubenswrapper[4806]: I0217 15:47:15.177206 4806 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerDied","Data":"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102"} Feb 17 15:47:15 crc kubenswrapper[4806]: I0217 15:47:15.179752 4806 generic.go:334] "Generic (PLEG): container finished" podID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerID="84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77" exitCode=143 Feb 17 15:47:15 crc kubenswrapper[4806]: I0217 15:47:15.179787 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerDied","Data":"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77"} Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.788548 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.795567 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.855387 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.855455 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.855476 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.855508 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.856663 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run" (OuterVolumeSpecName: "run") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.856940 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs" (OuterVolumeSpecName: "logs") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857000 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857022 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857040 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857067 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857086 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857127 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857150 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857175 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bzdq\" (UniqueName: \"kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857198 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g529g\" (UniqueName: \"kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857215 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857265 4806 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857285 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857321 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857346 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857367 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857384 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc 
kubenswrapper[4806]: I0217 15:47:17.857422 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857445 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857488 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857516 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857538 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857574 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: 
\"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857599 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts\") pod \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\" (UID: \"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857634 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules\") pod \"7097a9e4-3039-4225-a6bf-b7996aaf828d\" (UID: \"7097a9e4-3039-4225-a6bf-b7996aaf828d\") " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.857989 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys" (OuterVolumeSpecName: "sys") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858005 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858041 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858061 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858093 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run" (OuterVolumeSpecName: "run") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858333 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev" (OuterVolumeSpecName: "dev") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858363 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858414 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858480 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.858820 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs" (OuterVolumeSpecName: "logs") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.859110 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys" (OuterVolumeSpecName: "sys") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866438 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev" (OuterVolumeSpecName: "dev") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866709 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866754 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866781 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866802 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.866832 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.867020 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.873364 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.876960 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts" (OuterVolumeSpecName: "scripts") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.877194 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.878554 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts" (OuterVolumeSpecName: "scripts") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.878589 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq" (OuterVolumeSpecName: "kube-api-access-4bzdq") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "kube-api-access-4bzdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.879544 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.880846 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.884763 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g" (OuterVolumeSpecName: "kube-api-access-g529g") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "kube-api-access-g529g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.901587 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data" (OuterVolumeSpecName: "config-data") pod "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" (UID: "7cb3b2ac-4b70-479b-b2e2-4cc113ef2772"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.917691 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data" (OuterVolumeSpecName: "config-data") pod "7097a9e4-3039-4225-a6bf-b7996aaf828d" (UID: "7097a9e4-3039-4225-a6bf-b7996aaf828d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959509 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959574 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959590 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959602 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959614 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959635 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959645 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959654 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959662 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959670 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959678 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959686 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959698 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959706 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959714 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959726 4806 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959734 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959742 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7097a9e4-3039-4225-a6bf-b7996aaf828d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959749 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959757 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7097a9e4-3039-4225-a6bf-b7996aaf828d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959765 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bzdq\" (UniqueName: \"kubernetes.io/projected/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-kube-api-access-4bzdq\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959774 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g529g\" (UniqueName: \"kubernetes.io/projected/7097a9e4-3039-4225-a6bf-b7996aaf828d-kube-api-access-g529g\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959781 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-sys\") on node \"crc\" DevicePath \"\"" Feb 17 
15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959789 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7097a9e4-3039-4225-a6bf-b7996aaf828d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959796 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.959803 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.972570 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.972898 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.973045 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 15:47:17 crc kubenswrapper[4806]: I0217 15:47:17.990245 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.061182 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.061214 4806 
reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.061223 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.061231 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.215726 4806 generic.go:334] "Generic (PLEG): container finished" podID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerID="01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc" exitCode=0 Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.215832 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.215816 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerDied","Data":"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc"} Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.215993 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"7097a9e4-3039-4225-a6bf-b7996aaf828d","Type":"ContainerDied","Data":"3e3956852ad3bd7a42439c27f8a985a90d6341766de2e9505bb2f2d21f33d80f"} Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.216070 4806 scope.go:117] "RemoveContainer" containerID="01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.221193 4806 generic.go:334] "Generic (PLEG): container finished" podID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerID="099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef" exitCode=0 Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.221263 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerDied","Data":"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef"} Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.221313 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"7cb3b2ac-4b70-479b-b2e2-4cc113ef2772","Type":"ContainerDied","Data":"b03575efa54e07fa623603532079d1fe730b719f20b78f8c62685255d1d96bba"} Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.221337 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.249275 4806 scope.go:117] "RemoveContainer" containerID="8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.275922 4806 scope.go:117] "RemoveContainer" containerID="01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc" Feb 17 15:47:18 crc kubenswrapper[4806]: E0217 15:47:18.276543 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc\": container with ID starting with 01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc not found: ID does not exist" containerID="01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.276673 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc"} err="failed to get container status \"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc\": rpc error: code = NotFound desc = could not find container \"01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc\": container with ID starting with 01b2173304401168e7f142b7051a67ac92b82dd84f53f2bb6cdd66d33fb14cfc not found: ID does not exist" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.276713 4806 scope.go:117] "RemoveContainer" containerID="8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102" Feb 17 15:47:18 crc kubenswrapper[4806]: E0217 15:47:18.280390 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102\": container with ID starting with 
8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102 not found: ID does not exist" containerID="8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.280464 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102"} err="failed to get container status \"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102\": rpc error: code = NotFound desc = could not find container \"8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102\": container with ID starting with 8cbffac688f37bf09d06201a1f45cd9d8f1b48cb5ab7d9912b65f8aa34cab102 not found: ID does not exist" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.280519 4806 scope.go:117] "RemoveContainer" containerID="099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.281549 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.290612 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.303015 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.312773 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.318879 4806 scope.go:117] "RemoveContainer" containerID="84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.354979 4806 scope.go:117] "RemoveContainer" containerID="099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef" Feb 17 15:47:18 crc 
kubenswrapper[4806]: E0217 15:47:18.355512 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef\": container with ID starting with 099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef not found: ID does not exist" containerID="099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.355577 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef"} err="failed to get container status \"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef\": rpc error: code = NotFound desc = could not find container \"099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef\": container with ID starting with 099e3189454faeede20fe4f175109fc18ebc8c5e679c8366b361061fc1bf8bef not found: ID does not exist" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.355617 4806 scope.go:117] "RemoveContainer" containerID="84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77" Feb 17 15:47:18 crc kubenswrapper[4806]: E0217 15:47:18.356187 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77\": container with ID starting with 84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77 not found: ID does not exist" containerID="84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77" Feb 17 15:47:18 crc kubenswrapper[4806]: I0217 15:47:18.356231 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77"} err="failed to get container status 
\"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77\": rpc error: code = NotFound desc = could not find container \"84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77\": container with ID starting with 84aee304350909781110c181f1b42f0045378b944448a3b0116e8ae878e21d77 not found: ID does not exist" Feb 17 15:47:19 crc kubenswrapper[4806]: I0217 15:47:19.175571 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" path="/var/lib/kubelet/pods/7097a9e4-3039-4225-a6bf-b7996aaf828d/volumes" Feb 17 15:47:19 crc kubenswrapper[4806]: I0217 15:47:19.176446 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" path="/var/lib/kubelet/pods/7cb3b2ac-4b70-479b-b2e2-4cc113ef2772/volumes" Feb 17 15:47:19 crc kubenswrapper[4806]: I0217 15:47:19.454038 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:19 crc kubenswrapper[4806]: I0217 15:47:19.454962 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-log" containerID="cri-o://55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8" gracePeriod=30 Feb 17 15:47:19 crc kubenswrapper[4806]: I0217 15:47:19.455144 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-httpd" containerID="cri-o://947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26" gracePeriod=30 Feb 17 15:47:20 crc kubenswrapper[4806]: I0217 15:47:20.258811 4806 generic.go:334] "Generic (PLEG): container finished" podID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerID="55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8" exitCode=143 Feb 17 15:47:20 crc 
kubenswrapper[4806]: I0217 15:47:20.258937 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerDied","Data":"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8"} Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.894780 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943476 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943521 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943541 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943561 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943581 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943611 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943673 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943702 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943717 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943735 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943750 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943821 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-854wz\" (UniqueName: \"kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943842 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.943877 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme\") pod \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\" (UID: \"8cf87b2b-09d2-4b50-937b-aa437f44e3e0\") " Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.944241 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945694 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945686 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run" (OuterVolumeSpecName: "run") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945732 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945754 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys" (OuterVolumeSpecName: "sys") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945735 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945772 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev" (OuterVolumeSpecName: "dev") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.945776 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.946316 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs" (OuterVolumeSpecName: "logs") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.951976 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.951910 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz" (OuterVolumeSpecName: "kube-api-access-854wz") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "kube-api-access-854wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.952738 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.953794 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts" (OuterVolumeSpecName: "scripts") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:22 crc kubenswrapper[4806]: I0217 15:47:22.998186 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data" (OuterVolumeSpecName: "config-data") pod "8cf87b2b-09d2-4b50-937b-aa437f44e3e0" (UID: "8cf87b2b-09d2-4b50-937b-aa437f44e3e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.045937 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.045972 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.045982 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.045990 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.045998 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046034 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node 
\"crc\" " Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046043 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046056 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046067 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-854wz\" (UniqueName: \"kubernetes.io/projected/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-kube-api-access-854wz\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046077 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046085 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046093 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046100 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.046108 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/8cf87b2b-09d2-4b50-937b-aa437f44e3e0-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.059377 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.059606 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.148253 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.148295 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.289161 4806 generic.go:334] "Generic (PLEG): container finished" podID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerID="947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26" exitCode=0 Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.289218 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerDied","Data":"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26"} Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.289270 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8cf87b2b-09d2-4b50-937b-aa437f44e3e0","Type":"ContainerDied","Data":"5d4b1e06e2754cf7f2c00dcc9fa239ceafacf044016fab252e5517adc802d553"} Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 
15:47:23.289289 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.289299 4806 scope.go:117] "RemoveContainer" containerID="947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.319919 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.327176 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.340114 4806 scope.go:117] "RemoveContainer" containerID="55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.360623 4806 scope.go:117] "RemoveContainer" containerID="947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.361261 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26\": container with ID starting with 947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26 not found: ID does not exist" containerID="947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.361320 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26"} err="failed to get container status \"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26\": rpc error: code = NotFound desc = could not find container \"947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26\": container with ID starting with 
947c4fc936b3a104eb44b342bd5dbaddc2cb24a37fb3fd71f7f851b7421bdf26 not found: ID does not exist" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.361363 4806 scope.go:117] "RemoveContainer" containerID="55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.361823 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8\": container with ID starting with 55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8 not found: ID does not exist" containerID="55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.361904 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8"} err="failed to get container status \"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8\": rpc error: code = NotFound desc = could not find container \"55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8\": container with ID starting with 55817b8ed9941ab70b92d72c5f2a0f5c939df7b1338f442bef04c62a21f366d8 not found: ID does not exist" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.778136 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-h5nmw"] Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.787860 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-h5nmw"] Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814329 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancef285-account-delete-5256d"] Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814692 4806 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814709 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814719 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814726 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814737 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814744 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814753 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814759 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814768 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814774 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: E0217 15:47:23.814782 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" 
containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814788 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814913 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814924 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814935 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814944 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="7097a9e4-3039-4225-a6bf-b7996aaf828d" containerName="glance-log" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814956 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb3b2ac-4b70-479b-b2e2-4cc113ef2772" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.814967 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" containerName="glance-httpd" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.815487 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.822839 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef285-account-delete-5256d"] Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.862901 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkftr\" (UniqueName: \"kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.863135 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.964586 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkftr\" (UniqueName: \"kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.964648 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 
15:47:23.965608 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:23 crc kubenswrapper[4806]: I0217 15:47:23.989181 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkftr\" (UniqueName: \"kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr\") pod \"glancef285-account-delete-5256d\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:24 crc kubenswrapper[4806]: I0217 15:47:24.172901 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:24 crc kubenswrapper[4806]: I0217 15:47:24.671492 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancef285-account-delete-5256d"] Feb 17 15:47:25 crc kubenswrapper[4806]: I0217 15:47:25.169752 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c699d8-9f7e-4f32-822e-3e5f0c367100" path="/var/lib/kubelet/pods/71c699d8-9f7e-4f32-822e-3e5f0c367100/volumes" Feb 17 15:47:25 crc kubenswrapper[4806]: I0217 15:47:25.170782 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf87b2b-09d2-4b50-937b-aa437f44e3e0" path="/var/lib/kubelet/pods/8cf87b2b-09d2-4b50-937b-aa437f44e3e0/volumes" Feb 17 15:47:25 crc kubenswrapper[4806]: I0217 15:47:25.308202 4806 generic.go:334] "Generic (PLEG): container finished" podID="68daaf4a-90ad-442c-9bfd-41755ff4f788" containerID="1ecfd55fb8088461780b69f495b7a12cb233e5f734d5cdca10f292818a840d6b" exitCode=0 Feb 17 15:47:25 crc kubenswrapper[4806]: I0217 15:47:25.308246 4806 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glancef285-account-delete-5256d" event={"ID":"68daaf4a-90ad-442c-9bfd-41755ff4f788","Type":"ContainerDied","Data":"1ecfd55fb8088461780b69f495b7a12cb233e5f734d5cdca10f292818a840d6b"} Feb 17 15:47:25 crc kubenswrapper[4806]: I0217 15:47:25.308287 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef285-account-delete-5256d" event={"ID":"68daaf4a-90ad-442c-9bfd-41755ff4f788","Type":"ContainerStarted","Data":"069eead441affdce8fc06518ff8689ef68b8cfdfc75739432f09e3e6f7fbde39"} Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.651900 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.707587 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkftr\" (UniqueName: \"kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr\") pod \"68daaf4a-90ad-442c-9bfd-41755ff4f788\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.708238 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts\") pod \"68daaf4a-90ad-442c-9bfd-41755ff4f788\" (UID: \"68daaf4a-90ad-442c-9bfd-41755ff4f788\") " Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.708818 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68daaf4a-90ad-442c-9bfd-41755ff4f788" (UID: "68daaf4a-90ad-442c-9bfd-41755ff4f788"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.709031 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68daaf4a-90ad-442c-9bfd-41755ff4f788-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.714025 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr" (OuterVolumeSpecName: "kube-api-access-kkftr") pod "68daaf4a-90ad-442c-9bfd-41755ff4f788" (UID: "68daaf4a-90ad-442c-9bfd-41755ff4f788"). InnerVolumeSpecName "kube-api-access-kkftr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:26 crc kubenswrapper[4806]: I0217 15:47:26.810436 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkftr\" (UniqueName: \"kubernetes.io/projected/68daaf4a-90ad-442c-9bfd-41755ff4f788-kube-api-access-kkftr\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.042518 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 15:47:27 crc kubenswrapper[4806]: E0217 15:47:27.042845 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68daaf4a-90ad-442c-9bfd-41755ff4f788" containerName="mariadb-account-delete" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.042884 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="68daaf4a-90ad-442c-9bfd-41755ff4f788" containerName="mariadb-account-delete" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.043063 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="68daaf4a-90ad-442c-9bfd-41755ff4f788" containerName="mariadb-account-delete" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.043596 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.047972 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.048177 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.049375 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.052278 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-dfdfw" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.059623 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.115753 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-scripts\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.115821 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lxb\" (UniqueName: \"kubernetes.io/projected/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-kube-api-access-k4lxb\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.115873 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.116262 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.217987 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-scripts\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.218210 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lxb\" (UniqueName: \"kubernetes.io/projected/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-kube-api-access-k4lxb\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.218336 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.218453 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.219447 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-scripts\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.219979 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.228817 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-openstack-config-secret\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.233866 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lxb\" (UniqueName: \"kubernetes.io/projected/7216f040-d3bb-4b04-a47f-c8e878cc6f1f-kube-api-access-k4lxb\") pod \"openstackclient\" (UID: \"7216f040-d3bb-4b04-a47f-c8e878cc6f1f\") " pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.327338 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancef285-account-delete-5256d" 
event={"ID":"68daaf4a-90ad-442c-9bfd-41755ff4f788","Type":"ContainerDied","Data":"069eead441affdce8fc06518ff8689ef68b8cfdfc75739432f09e3e6f7fbde39"} Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.327377 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="069eead441affdce8fc06518ff8689ef68b8cfdfc75739432f09e3e6f7fbde39" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.327421 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancef285-account-delete-5256d" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.404359 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient" Feb 17 15:47:27 crc kubenswrapper[4806]: I0217 15:47:27.838693 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Feb 17 15:47:27 crc kubenswrapper[4806]: W0217 15:47:27.854655 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7216f040_d3bb_4b04_a47f_c8e878cc6f1f.slice/crio-daa7ef9f3fc26306ad39d7fc100bac2e91c537bd6d5494fb3c30a625062a9c5a WatchSource:0}: Error finding container daa7ef9f3fc26306ad39d7fc100bac2e91c537bd6d5494fb3c30a625062a9c5a: Status 404 returned error can't find the container with id daa7ef9f3fc26306ad39d7fc100bac2e91c537bd6d5494fb3c30a625062a9c5a Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.341285 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"7216f040-d3bb-4b04-a47f-c8e878cc6f1f","Type":"ContainerStarted","Data":"a0225475998635ee0cc38bc23011143eac7be79de7c00ffcd67603bd437ac146"} Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.341727 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" 
event={"ID":"7216f040-d3bb-4b04-a47f-c8e878cc6f1f","Type":"ContainerStarted","Data":"daa7ef9f3fc26306ad39d7fc100bac2e91c537bd6d5494fb3c30a625062a9c5a"} Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.375820 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=1.375784353 podStartE2EDuration="1.375784353s" podCreationTimestamp="2026-02-17 15:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:28.362583545 +0000 UTC m=+1609.893214036" watchObservedRunningTime="2026-02-17 15:47:28.375784353 +0000 UTC m=+1609.906414804" Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.860540 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-blg27"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.876604 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-blg27"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.886826 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-f285-account-create-update-vrq6c"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.896331 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-f285-account-create-update-vrq6c"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.904333 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancef285-account-delete-5256d"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.912611 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancef285-account-delete-5256d"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.931294 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-2k645"] Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.932155 
4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:28 crc kubenswrapper[4806]: I0217 15:47:28.953504 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-2k645"] Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.035214 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-2cd3-account-create-update-m69sk"] Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.036000 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.040040 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.045098 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djkb\" (UniqueName: \"kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.045348 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.047302 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2cd3-account-create-update-m69sk"] Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.146584 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7djkb\" (UniqueName: \"kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.146648 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrcqr\" (UniqueName: \"kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.146737 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.147102 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.147698 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.171225 4806 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="137fdb5e-a134-487c-bf9f-19991fdf35f3" path="/var/lib/kubelet/pods/137fdb5e-a134-487c-bf9f-19991fdf35f3/volumes" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.172077 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68daaf4a-90ad-442c-9bfd-41755ff4f788" path="/var/lib/kubelet/pods/68daaf4a-90ad-442c-9bfd-41755ff4f788/volumes" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.172198 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djkb\" (UniqueName: \"kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb\") pod \"glance-db-create-2k645\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.172731 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953bc43c-2b9c-4d7f-b38d-c0d35a394a6b" path="/var/lib/kubelet/pods/953bc43c-2b9c-4d7f-b38d-c0d35a394a6b/volumes" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.248818 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.248935 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrcqr\" (UniqueName: \"kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.250940 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.254759 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.286298 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrcqr\" (UniqueName: \"kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr\") pod \"glance-2cd3-account-create-update-m69sk\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.350687 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.657873 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2cd3-account-create-update-m69sk"] Feb 17 15:47:29 crc kubenswrapper[4806]: I0217 15:47:29.695590 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-2k645"] Feb 17 15:47:29 crc kubenswrapper[4806]: W0217 15:47:29.699445 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c1af8c_e873_4fb9_bf33_7870d77f2648.slice/crio-e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05 WatchSource:0}: Error finding container e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05: Status 404 returned error can't find the container with id e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05 Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.365150 4806 generic.go:334] "Generic (PLEG): container finished" podID="49c1af8c-e873-4fb9-bf33-7870d77f2648" containerID="5e3a6a543eeff88bf244f271b055da6a0345de68d9d120c0ae04a1e81f60cbb6" exitCode=0 Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.365299 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-2k645" event={"ID":"49c1af8c-e873-4fb9-bf33-7870d77f2648","Type":"ContainerDied","Data":"5e3a6a543eeff88bf244f271b055da6a0345de68d9d120c0ae04a1e81f60cbb6"} Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.365645 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-2k645" event={"ID":"49c1af8c-e873-4fb9-bf33-7870d77f2648","Type":"ContainerStarted","Data":"e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05"} Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.368266 4806 generic.go:334] "Generic (PLEG): container finished" 
podID="42352269-0456-403d-8e34-af83a7c51d0b" containerID="668d27e30ecf98055014a3d366d58d9090f8a08c5e498c67e0d7a1cff38575d3" exitCode=0 Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.368328 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" event={"ID":"42352269-0456-403d-8e34-af83a7c51d0b","Type":"ContainerDied","Data":"668d27e30ecf98055014a3d366d58d9090f8a08c5e498c67e0d7a1cff38575d3"} Feb 17 15:47:30 crc kubenswrapper[4806]: I0217 15:47:30.368369 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" event={"ID":"42352269-0456-403d-8e34-af83a7c51d0b","Type":"ContainerStarted","Data":"8eb9455404b3f6385ed58c9887c24b4d7d59817525a91f68f398fda2e6adbb47"} Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.852478 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.857745 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.885381 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrcqr\" (UniqueName: \"kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr\") pod \"42352269-0456-403d-8e34-af83a7c51d0b\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.885529 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts\") pod \"42352269-0456-403d-8e34-af83a7c51d0b\" (UID: \"42352269-0456-403d-8e34-af83a7c51d0b\") " Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.886433 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42352269-0456-403d-8e34-af83a7c51d0b" (UID: "42352269-0456-403d-8e34-af83a7c51d0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.892577 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr" (OuterVolumeSpecName: "kube-api-access-mrcqr") pod "42352269-0456-403d-8e34-af83a7c51d0b" (UID: "42352269-0456-403d-8e34-af83a7c51d0b"). InnerVolumeSpecName "kube-api-access-mrcqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.986895 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7djkb\" (UniqueName: \"kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb\") pod \"49c1af8c-e873-4fb9-bf33-7870d77f2648\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.986950 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts\") pod \"49c1af8c-e873-4fb9-bf33-7870d77f2648\" (UID: \"49c1af8c-e873-4fb9-bf33-7870d77f2648\") " Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.987177 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrcqr\" (UniqueName: \"kubernetes.io/projected/42352269-0456-403d-8e34-af83a7c51d0b-kube-api-access-mrcqr\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.987194 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42352269-0456-403d-8e34-af83a7c51d0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.987784 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49c1af8c-e873-4fb9-bf33-7870d77f2648" (UID: "49c1af8c-e873-4fb9-bf33-7870d77f2648"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 15:47:31 crc kubenswrapper[4806]: I0217 15:47:31.992591 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb" (OuterVolumeSpecName: "kube-api-access-7djkb") pod "49c1af8c-e873-4fb9-bf33-7870d77f2648" (UID: "49c1af8c-e873-4fb9-bf33-7870d77f2648"). InnerVolumeSpecName "kube-api-access-7djkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.088519 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7djkb\" (UniqueName: \"kubernetes.io/projected/49c1af8c-e873-4fb9-bf33-7870d77f2648-kube-api-access-7djkb\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.088560 4806 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49c1af8c-e873-4fb9-bf33-7870d77f2648-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.394211 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-2k645" event={"ID":"49c1af8c-e873-4fb9-bf33-7870d77f2648","Type":"ContainerDied","Data":"e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05"} Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.394286 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6ba24d6f8590b29af4d91f0696818e8ca932930608295a2a7c979f6655c4c05" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.394234 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-2k645" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.397363 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" event={"ID":"42352269-0456-403d-8e34-af83a7c51d0b","Type":"ContainerDied","Data":"8eb9455404b3f6385ed58c9887c24b4d7d59817525a91f68f398fda2e6adbb47"} Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.397459 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2cd3-account-create-update-m69sk" Feb 17 15:47:32 crc kubenswrapper[4806]: I0217 15:47:32.397451 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eb9455404b3f6385ed58c9887c24b4d7d59817525a91f68f398fda2e6adbb47" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.275700 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-9qrgm"] Feb 17 15:47:34 crc kubenswrapper[4806]: E0217 15:47:34.276613 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42352269-0456-403d-8e34-af83a7c51d0b" containerName="mariadb-account-create-update" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.276644 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="42352269-0456-403d-8e34-af83a7c51d0b" containerName="mariadb-account-create-update" Feb 17 15:47:34 crc kubenswrapper[4806]: E0217 15:47:34.276687 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c1af8c-e873-4fb9-bf33-7870d77f2648" containerName="mariadb-database-create" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.276705 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c1af8c-e873-4fb9-bf33-7870d77f2648" containerName="mariadb-database-create" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.277053 4806 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="49c1af8c-e873-4fb9-bf33-7870d77f2648" containerName="mariadb-database-create" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.277092 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="42352269-0456-403d-8e34-af83a7c51d0b" containerName="mariadb-account-create-update" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.278191 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.281328 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.281400 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-xsb79" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.292532 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-9qrgm"] Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.321157 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.321254 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpgkm\" (UniqueName: \"kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.321287 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.422746 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpgkm\" (UniqueName: \"kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.422828 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.422982 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.429936 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.433868 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.446162 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpgkm\" (UniqueName: \"kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm\") pod \"glance-db-sync-9qrgm\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.596894 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.785012 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:47:34 crc kubenswrapper[4806]: I0217 15:47:34.785331 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:47:35 crc kubenswrapper[4806]: I0217 15:47:35.111600 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-9qrgm"] Feb 17 15:47:35 crc kubenswrapper[4806]: I0217 15:47:35.423847 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-9qrgm" 
event={"ID":"f17db029-fc5e-47e0-a010-23beeb370f3f","Type":"ContainerStarted","Data":"7c8d3b2367113abe1c76934b2e237d9cd52189e95b5da05da313315e36726493"} Feb 17 15:47:36 crc kubenswrapper[4806]: I0217 15:47:36.436260 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-9qrgm" event={"ID":"f17db029-fc5e-47e0-a010-23beeb370f3f","Type":"ContainerStarted","Data":"a56371cfcab01cc152e17a9e82a81cf66fb0b96695477a3c7960d0e9ef30f3de"} Feb 17 15:47:36 crc kubenswrapper[4806]: I0217 15:47:36.467497 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-9qrgm" podStartSLOduration=2.467472169 podStartE2EDuration="2.467472169s" podCreationTimestamp="2026-02-17 15:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:36.456269631 +0000 UTC m=+1617.986900082" watchObservedRunningTime="2026-02-17 15:47:36.467472169 +0000 UTC m=+1617.998102620" Feb 17 15:47:38 crc kubenswrapper[4806]: I0217 15:47:38.453053 4806 generic.go:334] "Generic (PLEG): container finished" podID="f17db029-fc5e-47e0-a010-23beeb370f3f" containerID="a56371cfcab01cc152e17a9e82a81cf66fb0b96695477a3c7960d0e9ef30f3de" exitCode=0 Feb 17 15:47:38 crc kubenswrapper[4806]: I0217 15:47:38.453143 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-9qrgm" event={"ID":"f17db029-fc5e-47e0-a010-23beeb370f3f","Type":"ContainerDied","Data":"a56371cfcab01cc152e17a9e82a81cf66fb0b96695477a3c7960d0e9ef30f3de"} Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.784423 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.809342 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data\") pod \"f17db029-fc5e-47e0-a010-23beeb370f3f\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.809459 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data\") pod \"f17db029-fc5e-47e0-a010-23beeb370f3f\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.809568 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpgkm\" (UniqueName: \"kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm\") pod \"f17db029-fc5e-47e0-a010-23beeb370f3f\" (UID: \"f17db029-fc5e-47e0-a010-23beeb370f3f\") " Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.818384 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f17db029-fc5e-47e0-a010-23beeb370f3f" (UID: "f17db029-fc5e-47e0-a010-23beeb370f3f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.819485 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm" (OuterVolumeSpecName: "kube-api-access-hpgkm") pod "f17db029-fc5e-47e0-a010-23beeb370f3f" (UID: "f17db029-fc5e-47e0-a010-23beeb370f3f"). 
InnerVolumeSpecName "kube-api-access-hpgkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.879168 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data" (OuterVolumeSpecName: "config-data") pod "f17db029-fc5e-47e0-a010-23beeb370f3f" (UID: "f17db029-fc5e-47e0-a010-23beeb370f3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.911815 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpgkm\" (UniqueName: \"kubernetes.io/projected/f17db029-fc5e-47e0-a010-23beeb370f3f-kube-api-access-hpgkm\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.911870 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:39 crc kubenswrapper[4806]: I0217 15:47:39.911882 4806 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f17db029-fc5e-47e0-a010-23beeb370f3f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.470762 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-9qrgm" event={"ID":"f17db029-fc5e-47e0-a010-23beeb370f3f","Type":"ContainerDied","Data":"7c8d3b2367113abe1c76934b2e237d9cd52189e95b5da05da313315e36726493"} Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.471328 4806 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c8d3b2367113abe1c76934b2e237d9cd52189e95b5da05da313315e36726493" Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.470861 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-9qrgm" Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.675184 4806 scope.go:117] "RemoveContainer" containerID="42ef9acdb2fb43af5b82233be748b8433d5161205fdb8c32cf7a4028002480a6" Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.708046 4806 scope.go:117] "RemoveContainer" containerID="871b270ca2c1250f908eb66371c899e5cf8d9f41a0fec922079133f56ba0bad2" Feb 17 15:47:40 crc kubenswrapper[4806]: I0217 15:47:40.760529 4806 scope.go:117] "RemoveContainer" containerID="d8c949022626a671c3a88a99670a89d70eb2b8533a9641718ba4f5d37de884c4" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.248101 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:41 crc kubenswrapper[4806]: E0217 15:47:41.248741 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17db029-fc5e-47e0-a010-23beeb370f3f" containerName="glance-db-sync" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.248758 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17db029-fc5e-47e0-a010-23beeb370f3f" containerName="glance-db-sync" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.248904 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17db029-fc5e-47e0-a010-23beeb370f3f" containerName="glance-db-sync" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.249596 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.251283 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-xsb79" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.251778 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.251794 4806 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.261189 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.262925 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.270440 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.275999 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334395 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334477 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi\") pod \"glance-default-single-1\" (UID: 
\"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334525 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334543 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334706 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334745 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334834 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334865 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334883 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.334897 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.335009 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.335045 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbnh2\" (UniqueName: \"kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.335124 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.335152 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436720 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgsq\" (UniqueName: \"kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436770 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436800 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436820 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436848 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.436864 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437061 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437082 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc 
kubenswrapper[4806]: I0217 15:47:41.437115 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437134 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437153 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437331 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437396 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437442 4806 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437531 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437563 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437587 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437607 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437625 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437642 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437643 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437664 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437686 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437693 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs\") pod \"glance-default-single-1\" (UID: 
\"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437713 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbnh2\" (UniqueName: \"kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437747 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437765 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437800 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437824 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437843 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437866 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437893 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437911 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.437940 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " 
pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.438043 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.438150 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.438164 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.438337 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.438354 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.443230 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.443553 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.478359 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbnh2\" (UniqueName: \"kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.491999 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.519008 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540132 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540175 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540207 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540228 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540250 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540267 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: 
\"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540314 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540333 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540353 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540376 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgsq\" (UniqueName: \"kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540391 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 
crc kubenswrapper[4806]: I0217 15:47:41.540427 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540448 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540462 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.540899 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541092 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541238 4806 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541475 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541529 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541565 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541610 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541628 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541645 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541683 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.541689 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.544912 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.552002 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " 
pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.562912 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgsq\" (UniqueName: \"kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.566892 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.571977 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.579601 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:41 crc kubenswrapper[4806]: I0217 15:47:41.866297 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.007084 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.118211 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:42 crc kubenswrapper[4806]: W0217 15:47:42.122821 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12455ba6_19b9_49b0_a395_0963080088ae.slice/crio-b40fc93c8af98c1b23bd819715879b64c2b103c36a521d922eabda742be5f162 WatchSource:0}: Error finding container b40fc93c8af98c1b23bd819715879b64c2b103c36a521d922eabda742be5f162: Status 404 returned error can't find the container with id b40fc93c8af98c1b23bd819715879b64c2b103c36a521d922eabda742be5f162 Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.322171 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.497147 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerStarted","Data":"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.497188 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerStarted","Data":"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.497199 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" 
event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerStarted","Data":"a16aa06e59178832ff382bb0801c7a4713b4e5f70d8f2f3758f1d0f5c479e11b"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.497304 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-log" containerID="cri-o://0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" gracePeriod=30 Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.497749 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-httpd" containerID="cri-o://c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" gracePeriod=30 Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.500040 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerStarted","Data":"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.500071 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerStarted","Data":"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.500085 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerStarted","Data":"b40fc93c8af98c1b23bd819715879b64c2b103c36a521d922eabda742be5f162"} Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.522399 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=1.522379828 podStartE2EDuration="1.522379828s" podCreationTimestamp="2026-02-17 15:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:42.517908898 +0000 UTC m=+1624.048539349" watchObservedRunningTime="2026-02-17 15:47:42.522379828 +0000 UTC m=+1624.053010259" Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.551909 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.551887749 podStartE2EDuration="1.551887749s" podCreationTimestamp="2026-02-17 15:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:42.547384978 +0000 UTC m=+1624.078015389" watchObservedRunningTime="2026-02-17 15:47:42.551887749 +0000 UTC m=+1624.082518180" Feb 17 15:47:42 crc kubenswrapper[4806]: I0217 15:47:42.988474 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159629 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159717 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159748 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159738 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159795 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159818 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159824 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159862 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run" (OuterVolumeSpecName: "run") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159884 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159927 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.159990 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160049 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbnh2\" (UniqueName: \"kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160066 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys" (OuterVolumeSpecName: "sys") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160138 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160225 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160173 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160275 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160393 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs" (OuterVolumeSpecName: "logs") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160419 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev" (OuterVolumeSpecName: "dev") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160472 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160509 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.160603 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data\") pod \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\" (UID: \"9eec6ed6-e6a7-448d-b54e-407a9b1afe42\") " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161165 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161652 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161709 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161728 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161745 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161761 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161831 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161848 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161862 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.161877 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.165829 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.174092 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts" (OuterVolumeSpecName: "scripts") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.179180 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2" (OuterVolumeSpecName: "kube-api-access-kbnh2") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "kube-api-access-kbnh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.179353 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). 
InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.211378 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data" (OuterVolumeSpecName: "config-data") pod "9eec6ed6-e6a7-448d-b54e-407a9b1afe42" (UID: "9eec6ed6-e6a7-448d-b54e-407a9b1afe42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.262915 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.262946 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbnh2\" (UniqueName: \"kubernetes.io/projected/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-kube-api-access-kbnh2\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.262979 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.262994 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.263003 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9eec6ed6-e6a7-448d-b54e-407a9b1afe42-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.275807 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.281922 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.365192 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.365246 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:47:43 crc kubenswrapper[4806]: W0217 15:47:43.391074 4806 watcher.go:93] Error while processing event ("/sys/fs/cgroup/user.slice/user-0.slice/session-c22.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/user.slice/user-0.slice/session-c22.scope: no such file or directory Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511210 4806 generic.go:334] "Generic (PLEG): container finished" podID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerID="c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" exitCode=143 Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511666 4806 generic.go:334] "Generic (PLEG): container finished" podID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerID="0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" exitCode=143 Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511293 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511312 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerDied","Data":"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1"} Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511727 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerDied","Data":"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43"} Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511742 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"9eec6ed6-e6a7-448d-b54e-407a9b1afe42","Type":"ContainerDied","Data":"a16aa06e59178832ff382bb0801c7a4713b4e5f70d8f2f3758f1d0f5c479e11b"} Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.511758 4806 scope.go:117] "RemoveContainer" containerID="c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.540667 4806 scope.go:117] "RemoveContainer" containerID="0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.574139 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.576884 4806 scope.go:117] "RemoveContainer" containerID="c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" Feb 17 15:47:43 crc kubenswrapper[4806]: E0217 15:47:43.577212 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1\": container 
with ID starting with c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1 not found: ID does not exist" containerID="c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.577238 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1"} err="failed to get container status \"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1\": rpc error: code = NotFound desc = could not find container \"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1\": container with ID starting with c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1 not found: ID does not exist" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.577257 4806 scope.go:117] "RemoveContainer" containerID="0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" Feb 17 15:47:43 crc kubenswrapper[4806]: E0217 15:47:43.577542 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43\": container with ID starting with 0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43 not found: ID does not exist" containerID="0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.577567 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43"} err="failed to get container status \"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43\": rpc error: code = NotFound desc = could not find container \"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43\": container with ID starting with 0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43 not 
found: ID does not exist" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.577584 4806 scope.go:117] "RemoveContainer" containerID="c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.578154 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1"} err="failed to get container status \"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1\": rpc error: code = NotFound desc = could not find container \"c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1\": container with ID starting with c0f18e85bba1413bce94dddaf32f8966613e1cb10fe1a958958b5afe1fec76a1 not found: ID does not exist" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.578210 4806 scope.go:117] "RemoveContainer" containerID="0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.578699 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43"} err="failed to get container status \"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43\": rpc error: code = NotFound desc = could not find container \"0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43\": container with ID starting with 0f7235d25c1146194b3b10ab21cadb1334ca595d530ccc3977330d8ae35edc43 not found: ID does not exist" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.579143 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.600523 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:43 crc kubenswrapper[4806]: E0217 15:47:43.600803 4806 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-httpd" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.600817 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-httpd" Feb 17 15:47:43 crc kubenswrapper[4806]: E0217 15:47:43.600844 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-log" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.600851 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-log" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.600995 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-httpd" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.601010 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" containerName="glance-log" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.601988 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.617974 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774053 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774111 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774129 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-logs\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774148 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774332 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774381 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-scripts\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774479 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-httpd-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774538 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-sys\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774561 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774576 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-dev\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774596 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxx6\" (UniqueName: \"kubernetes.io/projected/b9f6e316-f240-48a8-8f37-7e0fa651c469-kube-api-access-6xxx6\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774621 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-config-data\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774687 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.774711 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-lib-modules\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876631 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-httpd-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876713 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-sys\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876746 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-dev\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876777 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxx6\" (UniqueName: \"kubernetes.io/projected/b9f6e316-f240-48a8-8f37-7e0fa651c469-kube-api-access-6xxx6\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876810 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876851 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-config-data\") pod \"glance-default-single-1\" 
(UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876913 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-lib-modules\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876926 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-dev\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.876960 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877010 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-sys\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877245 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc 
kubenswrapper[4806]: I0217 15:47:43.877274 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877316 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877336 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-logs\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877366 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877435 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877449 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877499 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877537 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-nvme\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877543 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-scripts\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877662 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-httpd-run\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877877 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: 
\"b9f6e316-f240-48a8-8f37-7e0fa651c469\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877916 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-lib-modules\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877943 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b9f6e316-f240-48a8-8f37-7e0fa651c469-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.877890 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9f6e316-f240-48a8-8f37-7e0fa651c469-logs\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.886312 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-scripts\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.886516 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9f6e316-f240-48a8-8f37-7e0fa651c469-config-data\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc 
kubenswrapper[4806]: I0217 15:47:43.893163 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxx6\" (UniqueName: \"kubernetes.io/projected/b9f6e316-f240-48a8-8f37-7e0fa651c469-kube-api-access-6xxx6\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.911103 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.915131 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-1\" (UID: \"b9f6e316-f240-48a8-8f37-7e0fa651c469\") " pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:43 crc kubenswrapper[4806]: I0217 15:47:43.926711 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:44 crc kubenswrapper[4806]: I0217 15:47:44.419257 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Feb 17 15:47:44 crc kubenswrapper[4806]: W0217 15:47:44.437927 4806 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9f6e316_f240_48a8_8f37_7e0fa651c469.slice/crio-3e0a4af5bdd0aa223337bc4cdee33122d84cdb1f394a76f2b46505938b2ae46f WatchSource:0}: Error finding container 3e0a4af5bdd0aa223337bc4cdee33122d84cdb1f394a76f2b46505938b2ae46f: Status 404 returned error can't find the container with id 3e0a4af5bdd0aa223337bc4cdee33122d84cdb1f394a76f2b46505938b2ae46f Feb 17 15:47:44 crc kubenswrapper[4806]: I0217 15:47:44.522621 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b9f6e316-f240-48a8-8f37-7e0fa651c469","Type":"ContainerStarted","Data":"3e0a4af5bdd0aa223337bc4cdee33122d84cdb1f394a76f2b46505938b2ae46f"} Feb 17 15:47:45 crc kubenswrapper[4806]: I0217 15:47:45.168006 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eec6ed6-e6a7-448d-b54e-407a9b1afe42" path="/var/lib/kubelet/pods/9eec6ed6-e6a7-448d-b54e-407a9b1afe42/volumes" Feb 17 15:47:45 crc kubenswrapper[4806]: I0217 15:47:45.537260 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b9f6e316-f240-48a8-8f37-7e0fa651c469","Type":"ContainerStarted","Data":"762cf11e7f910bc5a4a8d8e665281f12cc5eea4ee7303ac326fc018fe9caff56"} Feb 17 15:47:45 crc kubenswrapper[4806]: I0217 15:47:45.537901 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"b9f6e316-f240-48a8-8f37-7e0fa651c469","Type":"ContainerStarted","Data":"4fcd132c0f660642660762357b0f3c7bafb5328941bcd9b9536bd7710cee2db7"} Feb 17 
15:47:45 crc kubenswrapper[4806]: I0217 15:47:45.574331 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=2.57430712 podStartE2EDuration="2.57430712s" podCreationTimestamp="2026-02-17 15:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:47:45.563183765 +0000 UTC m=+1627.093814216" watchObservedRunningTime="2026-02-17 15:47:45.57430712 +0000 UTC m=+1627.104937541" Feb 17 15:47:51 crc kubenswrapper[4806]: I0217 15:47:51.866692 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:51 crc kubenswrapper[4806]: I0217 15:47:51.867733 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:51 crc kubenswrapper[4806]: I0217 15:47:51.903815 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:51 crc kubenswrapper[4806]: I0217 15:47:51.939347 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:52 crc kubenswrapper[4806]: I0217 15:47:52.595940 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:52 crc kubenswrapper[4806]: I0217 15:47:52.596008 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:53 crc kubenswrapper[4806]: I0217 15:47:53.928055 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:53 crc kubenswrapper[4806]: I0217 15:47:53.928216 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:53 crc kubenswrapper[4806]: I0217 15:47:53.964060 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:53 crc kubenswrapper[4806]: I0217 15:47:53.989909 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:54 crc kubenswrapper[4806]: I0217 15:47:54.458853 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:54 crc kubenswrapper[4806]: I0217 15:47:54.486013 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:47:54 crc kubenswrapper[4806]: I0217 15:47:54.610945 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:54 crc kubenswrapper[4806]: I0217 15:47:54.610978 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.499706 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.622499 4806 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.661550 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.707618 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.707935 4806 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-single-0" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-log" containerID="cri-o://e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2" gracePeriod=30 Feb 17 15:47:56 crc kubenswrapper[4806]: I0217 15:47:56.708071 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-httpd" containerID="cri-o://87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4" gracePeriod=30 Feb 17 15:47:57 crc kubenswrapper[4806]: I0217 15:47:57.637283 4806 generic.go:334] "Generic (PLEG): container finished" podID="12455ba6-19b9-49b0-a395-0963080088ae" containerID="e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2" exitCode=143 Feb 17 15:47:57 crc kubenswrapper[4806]: I0217 15:47:57.638997 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerDied","Data":"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2"} Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.344425 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479316 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479478 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479480 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run" (OuterVolumeSpecName: "run") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479525 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479558 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479571 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479598 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479639 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzgsq\" (UniqueName: \"kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479695 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479813 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479864 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479924 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.479997 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480056 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480109 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480141 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480167 4806 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"12455ba6-19b9-49b0-a395-0963080088ae\" (UID: \"12455ba6-19b9-49b0-a395-0963080088ae\") " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480180 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs" (OuterVolumeSpecName: "logs") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480226 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480255 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys" (OuterVolumeSpecName: "sys") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480695 4806 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480746 4806 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-lib-modules\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480777 4806 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-var-locks-brick\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480802 4806 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-logs\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480819 4806 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-sys\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.480835 4806 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-iscsi\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.481872 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.481947 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev" (OuterVolumeSpecName: "dev") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.482207 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.489356 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.489955 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq" (OuterVolumeSpecName: "kube-api-access-kzgsq") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "kube-api-access-kzgsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.491447 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.500859 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts" (OuterVolumeSpecName: "scripts") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.549151 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data" (OuterVolumeSpecName: "config-data") pod "12455ba6-19b9-49b0-a395-0963080088ae" (UID: "12455ba6-19b9-49b0-a395-0963080088ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582165 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzgsq\" (UniqueName: \"kubernetes.io/projected/12455ba6-19b9-49b0-a395-0963080088ae-kube-api-access-kzgsq\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582356 4806 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582466 4806 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-dev\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582544 4806 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12455ba6-19b9-49b0-a395-0963080088ae-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582618 4806 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/12455ba6-19b9-49b0-a395-0963080088ae-etc-nvme\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582718 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582831 4806 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12455ba6-19b9-49b0-a395-0963080088ae-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.582945 4806 reconciler_common.go:286] "operationExecutor.UnmountDevice started for 
volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.605229 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.605643 4806 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.666566 4806 generic.go:334] "Generic (PLEG): container finished" podID="12455ba6-19b9-49b0-a395-0963080088ae" containerID="87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4" exitCode=0 Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.666624 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerDied","Data":"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4"} Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.666669 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"12455ba6-19b9-49b0-a395-0963080088ae","Type":"ContainerDied","Data":"b40fc93c8af98c1b23bd819715879b64c2b103c36a521d922eabda742be5f162"} Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.666665 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.666707 4806 scope.go:117] "RemoveContainer" containerID="87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.684232 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.684264 4806 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.697040 4806 scope.go:117] "RemoveContainer" containerID="e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.710030 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.735272 4806 scope.go:117] "RemoveContainer" containerID="87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.735789 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:48:00 crc kubenswrapper[4806]: E0217 15:48:00.736115 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4\": container with ID starting with 87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4 not found: ID does not exist" containerID="87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.736186 4806 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4"} err="failed to get container status \"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4\": rpc error: code = NotFound desc = could not find container \"87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4\": container with ID starting with 87e3042db4e23709c06955ecc111bb50a082a0d1cb2acd40baa9a5449168a8d4 not found: ID does not exist" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.736229 4806 scope.go:117] "RemoveContainer" containerID="e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2" Feb 17 15:48:00 crc kubenswrapper[4806]: E0217 15:48:00.736755 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2\": container with ID starting with e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2 not found: ID does not exist" containerID="e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.736801 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2"} err="failed to get container status \"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2\": rpc error: code = NotFound desc = could not find container \"e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2\": container with ID starting with e421528e2c96658249770283f88a527d34ebdbd8248ebcb87b817b8f2afc87e2 not found: ID does not exist" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.745613 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:48:00 crc kubenswrapper[4806]: E0217 15:48:00.745972 4806 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-httpd" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.745996 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-httpd" Feb 17 15:48:00 crc kubenswrapper[4806]: E0217 15:48:00.746023 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-log" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.746033 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-log" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.746216 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-httpd" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.746233 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="12455ba6-19b9-49b0-a395-0963080088ae" containerName="glance-log" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.747264 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.759861 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.887835 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.887909 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.887998 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888044 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888083 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-sys\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888143 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqsfp\" (UniqueName: \"kubernetes.io/projected/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-kube-api-access-cqsfp\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888188 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888218 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-config-data\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888244 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-lib-modules\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888291 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-httpd-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888342 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-dev\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888371 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-scripts\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888764 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-logs\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.888835 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990690 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-dev\") 
pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990751 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-scripts\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990784 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-logs\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990845 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990886 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-dev\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.990983 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc 
kubenswrapper[4806]: I0217 15:48:00.991044 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991141 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991142 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991183 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991252 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-sys\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991394 4806 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991437 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqsfp\" (UniqueName: \"kubernetes.io/projected/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-kube-api-access-cqsfp\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991517 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991547 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-lib-modules\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991595 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-config-data\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991254 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991653 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-httpd-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991183 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-nvme\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991813 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-logs\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991893 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-lib-modules\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991907 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-etc-iscsi\") pod \"glance-default-single-0\" (UID: 
\"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991953 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-sys\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.991967 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.992379 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-httpd-run\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.996696 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-scripts\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:00 crc kubenswrapper[4806]: I0217 15:48:00.999809 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-config-data\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.016037 4806 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqsfp\" (UniqueName: \"kubernetes.io/projected/00ee37b0-a6e2-4bb0-94d6-cd3be23c533e-kube-api-access-cqsfp\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.026848 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.029817 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-0\" (UID: \"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e\") " pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.081517 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.174760 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12455ba6-19b9-49b0-a395-0963080088ae" path="/var/lib/kubelet/pods/12455ba6-19b9-49b0-a395-0963080088ae/volumes" Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.645066 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Feb 17 15:48:01 crc kubenswrapper[4806]: I0217 15:48:01.676373 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e","Type":"ContainerStarted","Data":"a41fbc09f0cf171cd4dcffbafcada775e06ae009471f61a0be427e209e3a0253"} Feb 17 15:48:02 crc kubenswrapper[4806]: I0217 15:48:02.716197 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e","Type":"ContainerStarted","Data":"2cfae5f426f6ac2e8b2ec4b8e5b870221a9b3bd28369b009931418234b4c06ae"} Feb 17 15:48:02 crc kubenswrapper[4806]: I0217 15:48:02.717095 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"00ee37b0-a6e2-4bb0-94d6-cd3be23c533e","Type":"ContainerStarted","Data":"c2bea945a0b199b7a35287e14e7a1a3a3384d36ef12f1af1c3a455b6e379b861"} Feb 17 15:48:02 crc kubenswrapper[4806]: I0217 15:48:02.766368 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.766340727 podStartE2EDuration="2.766340727s" podCreationTimestamp="2026-02-17 15:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 15:48:02.751675434 +0000 UTC m=+1644.282305895" watchObservedRunningTime="2026-02-17 15:48:02.766340727 +0000 
UTC m=+1644.296971178" Feb 17 15:48:04 crc kubenswrapper[4806]: I0217 15:48:04.784393 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:48:04 crc kubenswrapper[4806]: I0217 15:48:04.784862 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.082185 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.083858 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.113476 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.139390 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.587986 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:11 crc kubenswrapper[4806]: I0217 15:48:11.588027 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:13 crc kubenswrapper[4806]: I0217 15:48:13.377172 4806 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:13 crc kubenswrapper[4806]: I0217 15:48:13.380032 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Feb 17 15:48:34 crc kubenswrapper[4806]: I0217 15:48:34.784772 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:48:34 crc kubenswrapper[4806]: I0217 15:48:34.785304 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:48:34 crc kubenswrapper[4806]: I0217 15:48:34.785365 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" Feb 17 15:48:34 crc kubenswrapper[4806]: I0217 15:48:34.786170 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 15:48:34 crc kubenswrapper[4806]: I0217 15:48:34.786242 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" 
containerID="cri-o://184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" gracePeriod=600 Feb 17 15:48:34 crc kubenswrapper[4806]: E0217 15:48:34.914256 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:48:35 crc kubenswrapper[4806]: I0217 15:48:35.823746 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" exitCode=0 Feb 17 15:48:35 crc kubenswrapper[4806]: I0217 15:48:35.823832 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"} Feb 17 15:48:35 crc kubenswrapper[4806]: I0217 15:48:35.824118 4806 scope.go:117] "RemoveContainer" containerID="50c505eb17a168c57e5352f3069d8cb3bd254804a076e25bc12d0024ccfccf4f" Feb 17 15:48:35 crc kubenswrapper[4806]: I0217 15:48:35.824920 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:48:35 crc kubenswrapper[4806]: E0217 15:48:35.825517 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" 
podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:48:46 crc kubenswrapper[4806]: I0217 15:48:46.166086 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:48:46 crc kubenswrapper[4806]: E0217 15:48:46.167126 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:48:57 crc kubenswrapper[4806]: I0217 15:48:57.161445 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:48:57 crc kubenswrapper[4806]: E0217 15:48:57.164673 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:49:08 crc kubenswrapper[4806]: I0217 15:49:08.161896 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:49:08 crc kubenswrapper[4806]: E0217 15:49:08.165742 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:49:16 crc kubenswrapper[4806]: E0217 15:49:16.972849 4806 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 17 15:49:22 crc kubenswrapper[4806]: I0217 15:49:22.161303 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:49:22 crc kubenswrapper[4806]: E0217 15:49:22.162021 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:49:26 crc kubenswrapper[4806]: I0217 15:49:26.478985 4806 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/swift-proxy-5f6df75b65-sh9ht" podUID="5813437e-d2ad-4742-8598-5d78f8026604" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 17 15:49:36 crc kubenswrapper[4806]: I0217 15:49:36.162693 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:49:36 crc kubenswrapper[4806]: E0217 15:49:36.163498 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" 
podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:49:50 crc kubenswrapper[4806]: I0217 15:49:50.161710 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:49:50 crc kubenswrapper[4806]: E0217 15:49:50.162787 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:50:01 crc kubenswrapper[4806]: I0217 15:50:01.161675 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:50:01 crc kubenswrapper[4806]: E0217 15:50:01.162758 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:50:15 crc kubenswrapper[4806]: I0217 15:50:15.161679 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:50:15 crc kubenswrapper[4806]: E0217 15:50:15.163081 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:50:26 crc kubenswrapper[4806]: I0217 15:50:26.161607 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:50:26 crc kubenswrapper[4806]: E0217 15:50:26.162460 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:50:40 crc kubenswrapper[4806]: I0217 15:50:40.161212 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:50:40 crc kubenswrapper[4806]: E0217 15:50:40.162215 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:50:52 crc kubenswrapper[4806]: I0217 15:50:52.161164 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:50:52 crc kubenswrapper[4806]: E0217 15:50:52.162148 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:51:07 crc kubenswrapper[4806]: I0217 15:51:07.162192 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:51:07 crc kubenswrapper[4806]: E0217 15:51:07.163373 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:51:21 crc kubenswrapper[4806]: I0217 15:51:21.161034 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:51:21 crc kubenswrapper[4806]: E0217 15:51:21.161980 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:51:32 crc kubenswrapper[4806]: I0217 15:51:32.161679 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:51:32 crc kubenswrapper[4806]: E0217 15:51:32.162393 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:51:45 crc kubenswrapper[4806]: I0217 15:51:45.161567 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:51:45 crc kubenswrapper[4806]: E0217 15:51:45.162600 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.483425 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ck45l/must-gather-n8vp6"] Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.485034 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.487025 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ck45l"/"openshift-service-ca.crt" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.487132 4806 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ck45l"/"kube-root-ca.crt" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.487482 4806 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ck45l"/"default-dockercfg-5l5kx" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.502258 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ck45l/must-gather-n8vp6"] Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.637932 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpbk\" (UniqueName: \"kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.638012 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.739578 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpbk\" (UniqueName: \"kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " 
pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.739630 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.740013 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.761562 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpbk\" (UniqueName: \"kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk\") pod \"must-gather-n8vp6\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") " pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:51 crc kubenswrapper[4806]: I0217 15:51:51.800847 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:51:52 crc kubenswrapper[4806]: I0217 15:51:52.311891 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ck45l/must-gather-n8vp6"] Feb 17 15:51:52 crc kubenswrapper[4806]: I0217 15:51:52.317944 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 15:51:52 crc kubenswrapper[4806]: I0217 15:51:52.731229 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ck45l/must-gather-n8vp6" event={"ID":"9590bf64-6c80-4f52-b8bb-36801d9b0b3e","Type":"ContainerStarted","Data":"702be207722a4a62c12195c7257ddee36639874f648db33969af768fd27a86d5"} Feb 17 15:51:58 crc kubenswrapper[4806]: I0217 15:51:58.785004 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ck45l/must-gather-n8vp6" event={"ID":"9590bf64-6c80-4f52-b8bb-36801d9b0b3e","Type":"ContainerStarted","Data":"67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559"} Feb 17 15:51:58 crc kubenswrapper[4806]: I0217 15:51:58.785657 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ck45l/must-gather-n8vp6" event={"ID":"9590bf64-6c80-4f52-b8bb-36801d9b0b3e","Type":"ContainerStarted","Data":"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4"} Feb 17 15:51:58 crc kubenswrapper[4806]: I0217 15:51:58.808450 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ck45l/must-gather-n8vp6" podStartSLOduration=2.47045703 podStartE2EDuration="7.808431589s" podCreationTimestamp="2026-02-17 15:51:51 +0000 UTC" firstStartedPulling="2026-02-17 15:51:52.317558283 +0000 UTC m=+1873.848188694" lastFinishedPulling="2026-02-17 15:51:57.655532832 +0000 UTC m=+1879.186163253" observedRunningTime="2026-02-17 15:51:58.800997785 +0000 UTC m=+1880.331628206" watchObservedRunningTime="2026-02-17 15:51:58.808431589 +0000 UTC 
m=+1880.339062010" Feb 17 15:52:00 crc kubenswrapper[4806]: I0217 15:52:00.160979 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:52:00 crc kubenswrapper[4806]: E0217 15:52:00.161267 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:52:15 crc kubenswrapper[4806]: I0217 15:52:15.163567 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:52:15 crc kubenswrapper[4806]: E0217 15:52:15.164599 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.692041 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.693746 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.757221 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.878991 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.879104 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9s8x\" (UniqueName: \"kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.879205 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.980241 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.980792 4806 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s9s8x\" (UniqueName: \"kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.980864 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.980907 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:22 crc kubenswrapper[4806]: I0217 15:52:22.981292 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.024275 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9s8x\" (UniqueName: \"kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x\") pod \"community-operators-rtxk4\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.324568 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.740150 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.992388 4806 generic.go:334] "Generic (PLEG): container finished" podID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerID="5c7449f196ffe1ddc17a8b3c5f261d6b4156151c95fbe709d71099c3f69924c3" exitCode=0 Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.992480 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerDied","Data":"5c7449f196ffe1ddc17a8b3c5f261d6b4156151c95fbe709d71099c3f69924c3"} Feb 17 15:52:23 crc kubenswrapper[4806]: I0217 15:52:23.992546 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerStarted","Data":"740ce272be32dda90921fc2422309adb5fb36c66a716fdf5f4450d56d27f3fd1"} Feb 17 15:52:25 crc kubenswrapper[4806]: I0217 15:52:25.001391 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerStarted","Data":"1569affadee37a81c253f0a28653b8783b4b8ccda1187b58fec29d838c5b2e46"} Feb 17 15:52:26 crc kubenswrapper[4806]: I0217 15:52:26.011099 4806 generic.go:334] "Generic (PLEG): container finished" podID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerID="1569affadee37a81c253f0a28653b8783b4b8ccda1187b58fec29d838c5b2e46" exitCode=0 Feb 17 15:52:26 crc kubenswrapper[4806]: I0217 15:52:26.011223 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" 
event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerDied","Data":"1569affadee37a81c253f0a28653b8783b4b8ccda1187b58fec29d838c5b2e46"} Feb 17 15:52:27 crc kubenswrapper[4806]: I0217 15:52:27.019301 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerStarted","Data":"4ee7fd26d7af9d8d4e5b6f4a3190aa9d6479f0b5e646ddb8957f5b868c33c029"} Feb 17 15:52:27 crc kubenswrapper[4806]: I0217 15:52:27.054835 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rtxk4" podStartSLOduration=2.627819712 podStartE2EDuration="5.054804544s" podCreationTimestamp="2026-02-17 15:52:22 +0000 UTC" firstStartedPulling="2026-02-17 15:52:23.996755272 +0000 UTC m=+1905.527385683" lastFinishedPulling="2026-02-17 15:52:26.423740104 +0000 UTC m=+1907.954370515" observedRunningTime="2026-02-17 15:52:27.03687997 +0000 UTC m=+1908.567510401" watchObservedRunningTime="2026-02-17 15:52:27.054804544 +0000 UTC m=+1908.585434955" Feb 17 15:52:29 crc kubenswrapper[4806]: I0217 15:52:29.164883 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b" Feb 17 15:52:29 crc kubenswrapper[4806]: E0217 15:52:29.165363 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" Feb 17 15:52:33 crc kubenswrapper[4806]: I0217 15:52:33.325122 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:33 crc 
kubenswrapper[4806]: I0217 15:52:33.325473 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:33 crc kubenswrapper[4806]: I0217 15:52:33.377361 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:34 crc kubenswrapper[4806]: I0217 15:52:34.132321 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:36 crc kubenswrapper[4806]: I0217 15:52:36.885941 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:36 crc kubenswrapper[4806]: I0217 15:52:36.886780 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rtxk4" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="registry-server" containerID="cri-o://4ee7fd26d7af9d8d4e5b6f4a3190aa9d6479f0b5e646ddb8957f5b868c33c029" gracePeriod=2 Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.107983 4806 generic.go:334] "Generic (PLEG): container finished" podID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerID="4ee7fd26d7af9d8d4e5b6f4a3190aa9d6479f0b5e646ddb8957f5b868c33c029" exitCode=0 Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.108016 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerDied","Data":"4ee7fd26d7af9d8d4e5b6f4a3190aa9d6479f0b5e646ddb8957f5b868c33c029"} Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.317729 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.510555 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9s8x\" (UniqueName: \"kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x\") pod \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.510746 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content\") pod \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.510808 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities\") pod \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\" (UID: \"4ceefb74-3a06-4a8a-874a-abc0950a7df9\") " Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.511415 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities" (OuterVolumeSpecName: "utilities") pod "4ceefb74-3a06-4a8a-874a-abc0950a7df9" (UID: "4ceefb74-3a06-4a8a-874a-abc0950a7df9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.518861 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x" (OuterVolumeSpecName: "kube-api-access-s9s8x") pod "4ceefb74-3a06-4a8a-874a-abc0950a7df9" (UID: "4ceefb74-3a06-4a8a-874a-abc0950a7df9"). InnerVolumeSpecName "kube-api-access-s9s8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.562532 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ceefb74-3a06-4a8a-874a-abc0950a7df9" (UID: "4ceefb74-3a06-4a8a-874a-abc0950a7df9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.612921 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.612964 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ceefb74-3a06-4a8a-874a-abc0950a7df9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:37 crc kubenswrapper[4806]: I0217 15:52:37.612980 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9s8x\" (UniqueName: \"kubernetes.io/projected/4ceefb74-3a06-4a8a-874a-abc0950a7df9-kube-api-access-s9s8x\") on node \"crc\" DevicePath \"\"" Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.120365 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtxk4" event={"ID":"4ceefb74-3a06-4a8a-874a-abc0950a7df9","Type":"ContainerDied","Data":"740ce272be32dda90921fc2422309adb5fb36c66a716fdf5f4450d56d27f3fd1"} Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.120449 4806 scope.go:117] "RemoveContainer" containerID="4ee7fd26d7af9d8d4e5b6f4a3190aa9d6479f0b5e646ddb8957f5b868c33c029" Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.120451 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtxk4" Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.139498 4806 scope.go:117] "RemoveContainer" containerID="1569affadee37a81c253f0a28653b8783b4b8ccda1187b58fec29d838c5b2e46" Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.150715 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.161294 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rtxk4"] Feb 17 15:52:38 crc kubenswrapper[4806]: I0217 15:52:38.174234 4806 scope.go:117] "RemoveContainer" containerID="5c7449f196ffe1ddc17a8b3c5f261d6b4156151c95fbe709d71099c3f69924c3" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.169610 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" path="/var/lib/kubelet/pods/4ceefb74-3a06-4a8a-874a-abc0950a7df9/volumes" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.201304 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/util/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.357788 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/pull/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.359419 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/util/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.420783 4806 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/pull/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.587315 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/util/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.615108 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/pull/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.618199 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00f678af849290cc91e61669bf5802fefbf587118b725bb47400941310p4pg4_0c8efbeb-8b3e-49c4-ab0b-0e5beb2211c3/extract/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.745614 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/util/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.917280 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/pull/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.929875 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/util/0.log" Feb 17 15:52:39 crc kubenswrapper[4806]: I0217 15:52:39.955058 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/pull/0.log" Feb 17 15:52:40 crc 
kubenswrapper[4806]: I0217 15:52:40.109785 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/pull/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.116737 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/util/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.134848 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_28b7d49ba3f5aa7c44a31335a323fa1f9d605fc09146e7ae2d76f69e26hrsz6_f5a2c8b8-8042-4f62-a5d5-bed880f65261/extract/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.296251 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/util/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.458864 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/util/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.478003 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/pull/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.494979 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/pull/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.689481 4806 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/util/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.694231 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/pull/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.709889 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_414e4b97d51928ab5198aefe9c55ad0e5126b10e101f1abf7c39f91e067xg7k_eebc9abb-adc9-47ae-a370-fccd9e91a4da/extract/0.log" Feb 17 15:52:40 crc kubenswrapper[4806]: I0217 15:52:40.870172 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/util/0.log" Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.036157 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/util/0.log" Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.078201 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/pull/0.log" Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.094425 4806 scope.go:117] "RemoveContainer" containerID="a1e673933a0fd4645a200f17c4b65685d0d3ef15e9a2ca832992ef4995aaf7f6" Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.112592 4806 scope.go:117] "RemoveContainer" containerID="7a0bc37c58dbb89f40d0ed11afb1c0c82ff337cdd4552ae429cde519a25118df" Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.135878 4806 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/pull/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.142467 4806 scope.go:117] "RemoveContainer" containerID="4a531e2efe9ccc5c53c6be288fc4ba95479b72d50a772c81c720862ace5cee70"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.161459 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:52:41 crc kubenswrapper[4806]: E0217 15:52:41.161833 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.230457 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/pull/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.288952 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/util/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.312674 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d4a9dcdacca526e9e6f5178499d735b8b4c6fd7e962363bed41c17b9bw2nlj_e0da8e6d-ef0c-4cdc-b89e-0cb3b45c5322/extract/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.405182 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/util/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.687347 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/pull/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.752257 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/pull/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.753199 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/util/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.884894 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/util/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.908665 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/extract/0.log"
Feb 17 15:52:41 crc kubenswrapper[4806]: I0217 15:52:41.917869 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_61c5ec7b1d36b27470f0fbf6863c049f5b901f81228536cfdc751ed4729wzbw_c7062487-b8c9-4591-9d77-395a752598ce/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.055599 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.208249 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.214866 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.240477 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.401168 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.418515 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/extract/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.454345 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.472503 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590m7brk_da3c84f4-8168-4262-8636-79b9c7bd7d4d/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.651532 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.652823 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.657580 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.812716 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/extract/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.835183 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/pull/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.841660 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_97c0565e024cc42acedf7d327ee11bcdec84a0c0e4ddd546647cdbfd63vml4z_e94031f9-ac0c-4950-b703-2133541e2cf1/util/0.log"
Feb 17 15:52:42 crc kubenswrapper[4806]: I0217 15:52:42.955679 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84d4cfd9dd-bwz79_af129dac-8ce3-4199-85d4-e07ad5adf02b/manager/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.063685 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7684c4dfd4-hc257_c51c0d3e-e13f-4cdb-a842-d27644641a79/manager/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.084230 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-index-5v785_96c88e0a-5c93-40c3-b3d4-91cfdb8b6148/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.259868 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-9b5kl_4da12153-8e2b-42f8-b498-15f3035a3769/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.331253 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-8464bf4b7b-r2bsp_4156966d-8f2a-4e04-8484-779309f87ee9/manager/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.352379 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-hj996_dc89bc49-330e-48df-8030-51ee628cb608/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.506810 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-568c5665fb-9wsl7_ac32dab0-e793-4cbc-b363-a98a142aec89/manager/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.523701 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-qdvh5_0528fe97-ef52-441d-9e22-bf89676e6282/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.653055 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd577c68-qswc4_b9f35401-32a2-47fd-b1d3-688724190542/manager/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.735480 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-jdh9x_46b07016-998e-4215-81bb-b2c71a8ccd82/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.773382 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-j9lkp_e56ce5f8-ac4f-4e6c-86a5-bc6216f99e4f/operator/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.862273 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-5fnnd_38e7696e-97ac-4b38-9cd2-2e5e902aeb43/registry-server/0.log"
Feb 17 15:52:43 crc kubenswrapper[4806]: I0217 15:52:43.950308 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-69cdff58cd-ggj55_c47438c6-0196-42f5-8f8f-bf5e9ed6df78/manager/0.log"
Feb 17 15:54:44 crc kubenswrapper[4806]: I0217 15:52:44.035572 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-kdftz_4eb347b9-421b-4c66-97d5-1649602d2dd6/registry-server/0.log"
Feb 17 15:52:56 crc kubenswrapper[4806]: I0217 15:52:56.160951 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:52:56 crc kubenswrapper[4806]: E0217 15:52:56.161792 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d"
Feb 17 15:52:59 crc kubenswrapper[4806]: I0217 15:52:59.806836 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w6lrr_4efc9c9c-8be8-41de-b524-dfb7dc45c3d0/control-plane-machine-set-operator/0.log"
Feb 17 15:52:59 crc kubenswrapper[4806]: I0217 15:52:59.995871 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dhdkp_c8649291-1472-4f42-b8fe-447fa805d681/kube-rbac-proxy/0.log"
Feb 17 15:53:00 crc kubenswrapper[4806]: I0217 15:53:00.051718 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-dhdkp_c8649291-1472-4f42-b8fe-447fa805d681/machine-api-operator/0.log"
Feb 17 15:53:07 crc kubenswrapper[4806]: I0217 15:53:07.161585 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:53:07 crc kubenswrapper[4806]: E0217 15:53:07.162117 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d"
Feb 17 15:53:18 crc kubenswrapper[4806]: I0217 15:53:18.161684 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:53:18 crc kubenswrapper[4806]: E0217 15:53:18.162536 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d"
Feb 17 15:53:30 crc kubenswrapper[4806]: I0217 15:53:30.778864 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-fbsrm_a673b557-0484-42f5-b6ac-211f25330796/kube-rbac-proxy/0.log"
Feb 17 15:53:30 crc kubenswrapper[4806]: I0217 15:53:30.807302 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-fbsrm_a673b557-0484-42f5-b6ac-211f25330796/controller/0.log"
Feb 17 15:53:30 crc kubenswrapper[4806]: I0217 15:53:30.981971 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-frr-files/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.158908 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-frr-files/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.221984 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-metrics/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.259177 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-reloader/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.297159 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-reloader/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.433446 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-metrics/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.465292 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-frr-files/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.486136 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-reloader/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.509331 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-metrics/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.636865 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-frr-files/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.655021 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-reloader/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.712086 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/cp-metrics/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.714605 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/controller/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.851530 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/frr-metrics/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.946577 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/kube-rbac-proxy/0.log"
Feb 17 15:53:31 crc kubenswrapper[4806]: I0217 15:53:31.955961 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/kube-rbac-proxy-frr/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.037389 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/reloader/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.190732 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-vvpxh_15bfb41c-c1ca-453a-ad2b-04e10d1ce059/frr-k8s-webhook-server/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.298582 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-m5dz4_ae60a61d-5eab-4c36-8cbd-412a743c2c87/frr/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.372730 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5bc4556c9c-hnh6t_e08b3bf4-2745-4fd2-8cfa-1de763d3a957/manager/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.435946 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86df85fbff-5qgpr_1ad33e6b-3b94-4eab-aca2-8ff1b4e81a02/webhook-server/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.565477 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-84xmf_6191e3ef-dd3f-4905-aa2b-282df9ef96b8/kube-rbac-proxy/0.log"
Feb 17 15:53:32 crc kubenswrapper[4806]: I0217 15:53:32.670549 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-84xmf_6191e3ef-dd3f-4905-aa2b-282df9ef96b8/speaker/0.log"
Feb 17 15:53:33 crc kubenswrapper[4806]: I0217 15:53:33.161967 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:53:33 crc kubenswrapper[4806]: E0217 15:53:33.163342 4806 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jwndx_openshift-machine-config-operator(888ccee0-4c6b-45ea-9d8c-00668327ca0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d"
Feb 17 15:53:41 crc kubenswrapper[4806]: I0217 15:53:41.188026 4806 scope.go:117] "RemoveContainer" containerID="1ecfd55fb8088461780b69f495b7a12cb233e5f734d5cdca10f292818a840d6b"
Feb 17 15:53:45 crc kubenswrapper[4806]: I0217 15:53:45.160718 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:53:46 crc kubenswrapper[4806]: I0217 15:53:46.399097 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"acfb975f61a284d380f47e0173383086a6dd931d732d9ceb8ab542e1fcddb814"}
Feb 17 15:53:47 crc kubenswrapper[4806]: I0217 15:53:47.869957 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-2cd3-account-create-update-m69sk_42352269-0456-403d-8e34-af83a7c51d0b/mariadb-account-create-update/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.024563 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-create-2k645_49c1af8c-e873-4fb9-bf33-7870d77f2648/mariadb-database-create/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.030711 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-db-sync-9qrgm_f17db029-fc5e-47e0-a010-23beeb370f3f/glance-db-sync/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.214911 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-single-0_00ee37b0-a6e2-4bb0-94d6-cd3be23c533e/glance-log/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.264079 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-single-0_00ee37b0-a6e2-4bb0-94d6-cd3be23c533e/glance-httpd/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.285137 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-single-1_b9f6e316-f240-48a8-8f37-7e0fa651c469/glance-httpd/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.414268 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_glance-default-single-1_b9f6e316-f240-48a8-8f37-7e0fa651c469/glance-log/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.635142 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_keystone-75c54d45c8-njkpm_93a1bb0d-88da-450c-bea2-ced1b019457b/keystone-api/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.788199 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_64071ec4-119f-4213-9fc1-d7d9e665ca53/mysql-bootstrap/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.910444 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_64071ec4-119f-4213-9fc1-d7d9e665ca53/galera/0.log"
Feb 17 15:53:48 crc kubenswrapper[4806]: I0217 15:53:48.928948 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-0_64071ec4-119f-4213-9fc1-d7d9e665ca53/mysql-bootstrap/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.159999 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_cec2c43b-a867-4933-ba16-78ec075c6671/mysql-bootstrap/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.284103 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_cec2c43b-a867-4933-ba16-78ec075c6671/mysql-bootstrap/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.341266 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-1_cec2c43b-a867-4933-ba16-78ec075c6671/galera/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.467051 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f3d61fdf-8cbc-400a-ab38-7ee67a131849/mysql-bootstrap/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.722237 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f3d61fdf-8cbc-400a-ab38-7ee67a131849/mysql-bootstrap/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.769033 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstack-galera-2_f3d61fdf-8cbc-400a-ab38-7ee67a131849/galera/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.792984 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_memcached-0_36964ae0-c931-4256-9a0d-55e56cf16b33/memcached/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.914075 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_7216f040-d3bb-4b04-a47f-c8e878cc6f1f/openstackclient/0.log"
Feb 17 15:53:49 crc kubenswrapper[4806]: I0217 15:53:49.998936 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_e00f765b-c8c1-44b2-ad4a-9c17876f7ab4/setup-container/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.141758 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_e00f765b-c8c1-44b2-ad4a-9c17876f7ab4/rabbitmq/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.193878 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_rabbitmq-server-0_e00f765b-c8c1-44b2-ad4a-9c17876f7ab4/setup-container/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.288490 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-5f6df75b65-sh9ht_5813437e-d2ad-4742-8598-5d78f8026604/proxy-httpd/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.340892 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-proxy-5f6df75b65-sh9ht_5813437e-d2ad-4742-8598-5d78f8026604/proxy-server/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.432908 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-ring-rebalance-zl4m6_46887851-3f0c-4edf-ad3f-87602700b860/swift-ring-rebalance/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.557662 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/account-auditor/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.608949 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/account-reaper/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.624627 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/account-replicator/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.815165 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/account-server/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.871672 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/container-auditor/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.885699 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/container-replicator/0.log"
Feb 17 15:53:50 crc kubenswrapper[4806]: I0217 15:53:50.977368 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/container-server/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.049195 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/container-updater/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.054962 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/object-auditor/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.084794 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/object-expirer/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.187314 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/object-replicator/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.207815 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/object-server/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.250591 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/rsync/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.253575 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/object-updater/0.log"
Feb 17 15:53:51 crc kubenswrapper[4806]: I0217 15:53:51.344627 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_swift-storage-0_83699dfd-16c6-425d-b761-26b3635984ae/swift-recon-cron/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.516962 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/util/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.706434 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/util/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.734321 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/pull/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.737123 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/pull/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.833953 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/util/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.868553 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/pull/0.log"
Feb 17 15:54:05 crc kubenswrapper[4806]: I0217 15:54:05.896568 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213thp2x_42616f66-dce6-45b0-b11e-2802747e1212/extract/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.019784 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-utilities/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.196376 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-content/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.203478 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-utilities/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.217803 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-content/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.359368 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-utilities/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.371491 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/extract-content/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.597970 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-utilities/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.796624 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zk9n_55d6f08f-1a64-42ba-8633-98e6b012ff7c/registry-server/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.818042 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-content/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.851843 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-utilities/0.log"
Feb 17 15:54:06 crc kubenswrapper[4806]: I0217 15:54:06.861092 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-content/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.007198 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-utilities/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.027640 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/extract-content/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.230655 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gbz4l_7cae252d-6eec-4e1e-a829-9b11b21c4d75/marketplace-operator/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.251440 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-utilities/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.428805 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5972_00aa8f35-6b46-4bf7-9676-1b2721bc8981/registry-server/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.435528 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-utilities/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.470153 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-content/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.516353 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-content/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.693478 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-content/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.710178 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/extract-utilities/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.803990 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vfjtk_faa62cd1-70a5-4d9b-9d84-44e8dead8ea5/registry-server/0.log"
Feb 17 15:54:07 crc kubenswrapper[4806]: I0217 15:54:07.879141 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-utilities/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.078223 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-content/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.083578 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-content/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.087002 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-utilities/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.240767 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-content/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.253376 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/extract-utilities/0.log"
Feb 17 15:54:08 crc kubenswrapper[4806]: I0217 15:54:08.418201 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qbfnh_5d5616f2-e471-4b7b-9434-e6e438a0cb5d/registry-server/0.log"
Feb 17 15:55:19 crc kubenswrapper[4806]: I0217 15:55:19.219770 4806 generic.go:334] "Generic (PLEG): container finished" podID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerID="66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4" exitCode=0
Feb 17 15:55:19 crc kubenswrapper[4806]: I0217 15:55:19.219927 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ck45l/must-gather-n8vp6" event={"ID":"9590bf64-6c80-4f52-b8bb-36801d9b0b3e","Type":"ContainerDied","Data":"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4"}
Feb 17 15:55:19 crc kubenswrapper[4806]: I0217 15:55:19.221108 4806 scope.go:117] "RemoveContainer" containerID="66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4"
Feb 17 15:55:19 crc kubenswrapper[4806]: I0217 15:55:19.602897 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ck45l_must-gather-n8vp6_9590bf64-6c80-4f52-b8bb-36801d9b0b3e/gather/0.log"
Feb 17 15:55:26 crc kubenswrapper[4806]: I0217 15:55:26.708526 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ck45l/must-gather-n8vp6"]
Feb 17 15:55:26 crc kubenswrapper[4806]: I0217 15:55:26.709672 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ck45l/must-gather-n8vp6" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="copy" containerID="cri-o://67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559" gracePeriod=2
Feb 17 15:55:26 crc kubenswrapper[4806]: I0217 15:55:26.720235 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ck45l/must-gather-n8vp6"]
Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.135525 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ck45l_must-gather-n8vp6_9590bf64-6c80-4f52-b8bb-36801d9b0b3e/copy/0.log"
Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.136536 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ck45l/must-gather-n8vp6"
Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.154846 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twpbk\" (UniqueName: \"kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk\") pod \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") "
Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.155101 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output\") pod \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\" (UID: \"9590bf64-6c80-4f52-b8bb-36801d9b0b3e\") "
Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.163362 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk" (OuterVolumeSpecName: "kube-api-access-twpbk") pod "9590bf64-6c80-4f52-b8bb-36801d9b0b3e" (UID: "9590bf64-6c80-4f52-b8bb-36801d9b0b3e"). InnerVolumeSpecName "kube-api-access-twpbk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.244794 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9590bf64-6c80-4f52-b8bb-36801d9b0b3e" (UID: "9590bf64-6c80-4f52-b8bb-36801d9b0b3e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.256990 4806 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.257021 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twpbk\" (UniqueName: \"kubernetes.io/projected/9590bf64-6c80-4f52-b8bb-36801d9b0b3e-kube-api-access-twpbk\") on node \"crc\" DevicePath \"\"" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.297529 4806 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ck45l_must-gather-n8vp6_9590bf64-6c80-4f52-b8bb-36801d9b0b3e/copy/0.log" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.298269 4806 generic.go:334] "Generic (PLEG): container finished" podID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerID="67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559" exitCode=143 Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.298342 4806 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ck45l/must-gather-n8vp6" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.298354 4806 scope.go:117] "RemoveContainer" containerID="67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.341802 4806 scope.go:117] "RemoveContainer" containerID="66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.405361 4806 scope.go:117] "RemoveContainer" containerID="67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559" Feb 17 15:55:27 crc kubenswrapper[4806]: E0217 15:55:27.405874 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559\": container with ID starting with 67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559 not found: ID does not exist" containerID="67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.405973 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559"} err="failed to get container status \"67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559\": rpc error: code = NotFound desc = could not find container \"67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559\": container with ID starting with 67a663ed4a4ccfbdea5d9555f30a2f5407613c0eaef9b207e9c0973b3f0aa559 not found: ID does not exist" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.406062 4806 scope.go:117] "RemoveContainer" containerID="66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4" Feb 17 15:55:27 crc kubenswrapper[4806]: E0217 15:55:27.406355 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4\": container with ID starting with 66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4 not found: ID does not exist" containerID="66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4" Feb 17 15:55:27 crc kubenswrapper[4806]: I0217 15:55:27.406385 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4"} err="failed to get container status \"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4\": rpc error: code = NotFound desc = could not find container \"66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4\": container with ID starting with 66ddc779919ed4f7219dc96258c7c61c282536b0e2766c6a25908b4883c49fa4 not found: ID does not exist" Feb 17 15:55:29 crc kubenswrapper[4806]: I0217 15:55:29.172694 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" path="/var/lib/kubelet/pods/9590bf64-6c80-4f52-b8bb-36801d9b0b3e/volumes" Feb 17 15:56:04 crc kubenswrapper[4806]: I0217 15:56:04.784304 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:56:04 crc kubenswrapper[4806]: I0217 15:56:04.786132 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:56:34 crc kubenswrapper[4806]: I0217 15:56:34.785159 4806 patch_prober.go:28] 
interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 15:56:34 crc kubenswrapper[4806]: I0217 15:56:34.785735 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630048 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:37 crc kubenswrapper[4806]: E0217 15:56:37.630802 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="extract-content" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630823 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="extract-content" Feb 17 15:56:37 crc kubenswrapper[4806]: E0217 15:56:37.630848 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="registry-server" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630859 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="registry-server" Feb 17 15:56:37 crc kubenswrapper[4806]: E0217 15:56:37.630880 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="copy" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630891 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="copy" 
Feb 17 15:56:37 crc kubenswrapper[4806]: E0217 15:56:37.630913 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="extract-utilities" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630923 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="extract-utilities" Feb 17 15:56:37 crc kubenswrapper[4806]: E0217 15:56:37.630939 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="gather" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.630950 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="gather" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.631136 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="gather" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.631162 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ceefb74-3a06-4a8a-874a-abc0950a7df9" containerName="registry-server" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.631184 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9590bf64-6c80-4f52-b8bb-36801d9b0b3e" containerName="copy" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.632667 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.653952 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.719683 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.719764 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.720065 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgdj\" (UniqueName: \"kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.822192 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgdj\" (UniqueName: \"kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.822339 4806 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.822432 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.822955 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.823024 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.843968 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgdj\" (UniqueName: \"kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj\") pod \"certified-operators-g2kl6\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:37 crc kubenswrapper[4806]: I0217 15:56:37.958138 4806 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:38 crc kubenswrapper[4806]: I0217 15:56:38.435602 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:38 crc kubenswrapper[4806]: I0217 15:56:38.954485 4806 generic.go:334] "Generic (PLEG): container finished" podID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerID="c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9" exitCode=0 Feb 17 15:56:38 crc kubenswrapper[4806]: I0217 15:56:38.954565 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerDied","Data":"c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9"} Feb 17 15:56:38 crc kubenswrapper[4806]: I0217 15:56:38.954836 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerStarted","Data":"59aea424ff93aac05dd9d0041145a7def3a2cad8f178724ec5baa858279369cf"} Feb 17 15:56:39 crc kubenswrapper[4806]: I0217 15:56:39.967637 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerStarted","Data":"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8"} Feb 17 15:56:40 crc kubenswrapper[4806]: I0217 15:56:40.981519 4806 generic.go:334] "Generic (PLEG): container finished" podID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerID="a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8" exitCode=0 Feb 17 15:56:40 crc kubenswrapper[4806]: I0217 15:56:40.981633 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" 
event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerDied","Data":"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8"} Feb 17 15:56:41 crc kubenswrapper[4806]: I0217 15:56:41.996908 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerStarted","Data":"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b"} Feb 17 15:56:42 crc kubenswrapper[4806]: I0217 15:56:42.033247 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g2kl6" podStartSLOduration=2.511336387 podStartE2EDuration="5.033221397s" podCreationTimestamp="2026-02-17 15:56:37 +0000 UTC" firstStartedPulling="2026-02-17 15:56:38.957334954 +0000 UTC m=+2160.487965365" lastFinishedPulling="2026-02-17 15:56:41.479219934 +0000 UTC m=+2163.009850375" observedRunningTime="2026-02-17 15:56:42.023001182 +0000 UTC m=+2163.553631663" watchObservedRunningTime="2026-02-17 15:56:42.033221397 +0000 UTC m=+2163.563851848" Feb 17 15:56:47 crc kubenswrapper[4806]: I0217 15:56:47.958725 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:47 crc kubenswrapper[4806]: I0217 15:56:47.959628 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:48 crc kubenswrapper[4806]: I0217 15:56:48.004754 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:48 crc kubenswrapper[4806]: I0217 15:56:48.127639 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:48 crc kubenswrapper[4806]: I0217 15:56:48.260732 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.074615 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g2kl6" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="registry-server" containerID="cri-o://b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b" gracePeriod=2 Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.567584 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.742895 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities\") pod \"9970a2cb-a209-4a94-908c-34d52231c7d4\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.742987 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvgdj\" (UniqueName: \"kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj\") pod \"9970a2cb-a209-4a94-908c-34d52231c7d4\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.743054 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content\") pod \"9970a2cb-a209-4a94-908c-34d52231c7d4\" (UID: \"9970a2cb-a209-4a94-908c-34d52231c7d4\") " Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.744498 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities" (OuterVolumeSpecName: "utilities") pod "9970a2cb-a209-4a94-908c-34d52231c7d4" (UID: 
"9970a2cb-a209-4a94-908c-34d52231c7d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.761625 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj" (OuterVolumeSpecName: "kube-api-access-wvgdj") pod "9970a2cb-a209-4a94-908c-34d52231c7d4" (UID: "9970a2cb-a209-4a94-908c-34d52231c7d4"). InnerVolumeSpecName "kube-api-access-wvgdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.814945 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9970a2cb-a209-4a94-908c-34d52231c7d4" (UID: "9970a2cb-a209-4a94-908c-34d52231c7d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.845355 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.845427 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvgdj\" (UniqueName: \"kubernetes.io/projected/9970a2cb-a209-4a94-908c-34d52231c7d4-kube-api-access-wvgdj\") on node \"crc\" DevicePath \"\"" Feb 17 15:56:50 crc kubenswrapper[4806]: I0217 15:56:50.845450 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9970a2cb-a209-4a94-908c-34d52231c7d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.088465 4806 generic.go:334] "Generic (PLEG): container finished" 
podID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerID="b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b" exitCode=0 Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.088524 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g2kl6" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.088528 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerDied","Data":"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b"} Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.088774 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g2kl6" event={"ID":"9970a2cb-a209-4a94-908c-34d52231c7d4","Type":"ContainerDied","Data":"59aea424ff93aac05dd9d0041145a7def3a2cad8f178724ec5baa858279369cf"} Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.088832 4806 scope.go:117] "RemoveContainer" containerID="b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.122598 4806 scope.go:117] "RemoveContainer" containerID="a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.193551 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.194720 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g2kl6"] Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.212789 4806 scope.go:117] "RemoveContainer" containerID="c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.232646 4806 scope.go:117] "RemoveContainer" 
containerID="b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b" Feb 17 15:56:51 crc kubenswrapper[4806]: E0217 15:56:51.233486 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b\": container with ID starting with b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b not found: ID does not exist" containerID="b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.233516 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b"} err="failed to get container status \"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b\": rpc error: code = NotFound desc = could not find container \"b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b\": container with ID starting with b257f253d63501cad2595964aee6f99e677935b2e217d3d04a23a0e345f9b01b not found: ID does not exist" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.233539 4806 scope.go:117] "RemoveContainer" containerID="a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8" Feb 17 15:56:51 crc kubenswrapper[4806]: E0217 15:56:51.233965 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8\": container with ID starting with a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8 not found: ID does not exist" containerID="a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8" Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.233989 4806 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8"} err="failed to get container status \"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8\": rpc error: code = NotFound desc = could not find container \"a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8\": container with ID starting with a15adea443997b5fc0dcd097321bfcd045058feaffe3dc2fd940fb1640e497e8 not found: ID does not exist"
Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.234010 4806 scope.go:117] "RemoveContainer" containerID="c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9"
Feb 17 15:56:51 crc kubenswrapper[4806]: E0217 15:56:51.234537 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9\": container with ID starting with c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9 not found: ID does not exist" containerID="c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9"
Feb 17 15:56:51 crc kubenswrapper[4806]: I0217 15:56:51.234615 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9"} err="failed to get container status \"c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9\": rpc error: code = NotFound desc = could not find container \"c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9\": container with ID starting with c0f974dd16cd856ea4122e77d1cde90f359f210c784bccf086462d35c6796bc9 not found: ID does not exist"
Feb 17 15:56:53 crc kubenswrapper[4806]: I0217 15:56:53.174918 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" path="/var/lib/kubelet/pods/9970a2cb-a209-4a94-908c-34d52231c7d4/volumes"
Feb 17 15:57:04 crc kubenswrapper[4806]: I0217 15:57:04.784482 4806 patch_prober.go:28] interesting pod/machine-config-daemon-jwndx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 15:57:04 crc kubenswrapper[4806]: I0217 15:57:04.785091 4806 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 15:57:04 crc kubenswrapper[4806]: I0217 15:57:04.785190 4806 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jwndx"
Feb 17 15:57:04 crc kubenswrapper[4806]: I0217 15:57:04.786015 4806 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"acfb975f61a284d380f47e0173383086a6dd931d732d9ceb8ab542e1fcddb814"} pod="openshift-machine-config-operator/machine-config-daemon-jwndx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 15:57:04 crc kubenswrapper[4806]: I0217 15:57:04.786120 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" podUID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerName="machine-config-daemon" containerID="cri-o://acfb975f61a284d380f47e0173383086a6dd931d732d9ceb8ab542e1fcddb814" gracePeriod=600
Feb 17 15:57:05 crc kubenswrapper[4806]: I0217 15:57:05.367371 4806 generic.go:334] "Generic (PLEG): container finished" podID="888ccee0-4c6b-45ea-9d8c-00668327ca0d" containerID="acfb975f61a284d380f47e0173383086a6dd931d732d9ceb8ab542e1fcddb814" exitCode=0
Feb 17 15:57:05 crc kubenswrapper[4806]: I0217 15:57:05.368192 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerDied","Data":"acfb975f61a284d380f47e0173383086a6dd931d732d9ceb8ab542e1fcddb814"}
Feb 17 15:57:05 crc kubenswrapper[4806]: I0217 15:57:05.368223 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jwndx" event={"ID":"888ccee0-4c6b-45ea-9d8c-00668327ca0d","Type":"ContainerStarted","Data":"67ecd20863c7c138f120a68a1328a304c3b5cbddd358b76e37079518b1da21cb"}
Feb 17 15:57:05 crc kubenswrapper[4806]: I0217 15:57:05.368381 4806 scope.go:117] "RemoveContainer" containerID="184ba6ac6947dd64d3a3c0d99bf9d4a2e3f539118a9158d2152b5987893d149b"
Feb 17 15:57:32 crc kubenswrapper[4806]: I0217 15:57:32.038782 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-2cd3-account-create-update-m69sk"]
Feb 17 15:57:32 crc kubenswrapper[4806]: I0217 15:57:32.053080 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-2k645"]
Feb 17 15:57:32 crc kubenswrapper[4806]: I0217 15:57:32.063919 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-2cd3-account-create-update-m69sk"]
Feb 17 15:57:32 crc kubenswrapper[4806]: I0217 15:57:32.075205 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-2k645"]
Feb 17 15:57:33 crc kubenswrapper[4806]: I0217 15:57:33.177699 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42352269-0456-403d-8e34-af83a7c51d0b" path="/var/lib/kubelet/pods/42352269-0456-403d-8e34-af83a7c51d0b/volumes"
Feb 17 15:57:33 crc kubenswrapper[4806]: I0217 15:57:33.179150 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c1af8c-e873-4fb9-bf33-7870d77f2648" path="/var/lib/kubelet/pods/49c1af8c-e873-4fb9-bf33-7870d77f2648/volumes"
Feb 17 15:57:40 crc kubenswrapper[4806]: I0217 15:57:40.031920 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-9qrgm"]
Feb 17 15:57:40 crc kubenswrapper[4806]: I0217 15:57:40.044994 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-9qrgm"]
Feb 17 15:57:41 crc kubenswrapper[4806]: I0217 15:57:41.177982 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17db029-fc5e-47e0-a010-23beeb370f3f" path="/var/lib/kubelet/pods/f17db029-fc5e-47e0-a010-23beeb370f3f/volumes"
Feb 17 15:57:41 crc kubenswrapper[4806]: I0217 15:57:41.309585 4806 scope.go:117] "RemoveContainer" containerID="a56371cfcab01cc152e17a9e82a81cf66fb0b96695477a3c7960d0e9ef30f3de"
Feb 17 15:57:41 crc kubenswrapper[4806]: I0217 15:57:41.385508 4806 scope.go:117] "RemoveContainer" containerID="668d27e30ecf98055014a3d366d58d9090f8a08c5e498c67e0d7a1cff38575d3"
Feb 17 15:57:41 crc kubenswrapper[4806]: I0217 15:57:41.416707 4806 scope.go:117] "RemoveContainer" containerID="5e3a6a543eeff88bf244f271b055da6a0345de68d9d120c0ae04a1e81f60cbb6"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.115339 4806 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:57:47 crc kubenswrapper[4806]: E0217 15:57:47.116375 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="extract-utilities"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.116529 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="extract-utilities"
Feb 17 15:57:47 crc kubenswrapper[4806]: E0217 15:57:47.116545 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="registry-server"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.116556 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="registry-server"
Feb 17 15:57:47 crc kubenswrapper[4806]: E0217 15:57:47.116589 4806 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="extract-content"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.116601 4806 state_mem.go:107] "Deleted CPUSet assignment" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="extract-content"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.116810 4806 memory_manager.go:354] "RemoveStaleState removing state" podUID="9970a2cb-a209-4a94-908c-34d52231c7d4" containerName="registry-server"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.118056 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.136247 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.147638 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9d9\" (UniqueName: \"kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.147736 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.147773 4806 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.249045 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9d9\" (UniqueName: \"kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.249150 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.249181 4806 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.250086 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.250955 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.275044 4806 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9d9\" (UniqueName: \"kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9\") pod \"redhat-marketplace-r4hb9\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") " pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.448737 4806 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.672741 4806 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:57:47 crc kubenswrapper[4806]: I0217 15:57:47.765182 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerStarted","Data":"85b967fdbac3ceb976d9abdcc41a8754e4a79b7187c67e15e9f2d58be8330a62"}
Feb 17 15:57:48 crc kubenswrapper[4806]: I0217 15:57:48.778585 4806 generic.go:334] "Generic (PLEG): container finished" podID="943a7574-b55b-4acf-a432-0a5d1dde6b0b" containerID="cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab" exitCode=0
Feb 17 15:57:48 crc kubenswrapper[4806]: I0217 15:57:48.778640 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerDied","Data":"cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab"}
Feb 17 15:57:48 crc kubenswrapper[4806]: I0217 15:57:48.784909 4806 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 15:57:50 crc kubenswrapper[4806]: I0217 15:57:50.801129 4806 generic.go:334] "Generic (PLEG): container finished" podID="943a7574-b55b-4acf-a432-0a5d1dde6b0b" containerID="61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f" exitCode=0
Feb 17 15:57:50 crc kubenswrapper[4806]: I0217 15:57:50.801468 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerDied","Data":"61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f"}
Feb 17 15:57:51 crc kubenswrapper[4806]: I0217 15:57:51.811275 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerStarted","Data":"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"}
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.449553 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.449985 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.525769 4806 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.560756 4806 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r4hb9" podStartSLOduration=7.842202557 podStartE2EDuration="10.560728297s" podCreationTimestamp="2026-02-17 15:57:47 +0000 UTC" firstStartedPulling="2026-02-17 15:57:48.784526666 +0000 UTC m=+2230.315157107" lastFinishedPulling="2026-02-17 15:57:51.503052426 +0000 UTC m=+2233.033682847" observedRunningTime="2026-02-17 15:57:51.846962184 +0000 UTC m=+2233.377592665" watchObservedRunningTime="2026-02-17 15:57:57.560728297 +0000 UTC m=+2239.091358748"
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.917579 4806 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:57:57 crc kubenswrapper[4806]: I0217 15:57:57.969253 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:57:59 crc kubenswrapper[4806]: I0217 15:57:59.881446 4806 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r4hb9" podUID="943a7574-b55b-4acf-a432-0a5d1dde6b0b" containerName="registry-server" containerID="cri-o://db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4" gracePeriod=2
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.378684 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.472425 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content\") pod \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") "
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.472566 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl9d9\" (UniqueName: \"kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9\") pod \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") "
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.472688 4806 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities\") pod \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\" (UID: \"943a7574-b55b-4acf-a432-0a5d1dde6b0b\") "
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.474297 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities" (OuterVolumeSpecName: "utilities") pod "943a7574-b55b-4acf-a432-0a5d1dde6b0b" (UID: "943a7574-b55b-4acf-a432-0a5d1dde6b0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.483179 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9" (OuterVolumeSpecName: "kube-api-access-hl9d9") pod "943a7574-b55b-4acf-a432-0a5d1dde6b0b" (UID: "943a7574-b55b-4acf-a432-0a5d1dde6b0b"). InnerVolumeSpecName "kube-api-access-hl9d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.575180 4806 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl9d9\" (UniqueName: \"kubernetes.io/projected/943a7574-b55b-4acf-a432-0a5d1dde6b0b-kube-api-access-hl9d9\") on node \"crc\" DevicePath \"\""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.575583 4806 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.621883 4806 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "943a7574-b55b-4acf-a432-0a5d1dde6b0b" (UID: "943a7574-b55b-4acf-a432-0a5d1dde6b0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.677070 4806 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943a7574-b55b-4acf-a432-0a5d1dde6b0b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.892569 4806 generic.go:334] "Generic (PLEG): container finished" podID="943a7574-b55b-4acf-a432-0a5d1dde6b0b" containerID="db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4" exitCode=0
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.892588 4806 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4hb9"
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.892607 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerDied","Data":"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"}
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.893181 4806 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4hb9" event={"ID":"943a7574-b55b-4acf-a432-0a5d1dde6b0b","Type":"ContainerDied","Data":"85b967fdbac3ceb976d9abdcc41a8754e4a79b7187c67e15e9f2d58be8330a62"}
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.893205 4806 scope.go:117] "RemoveContainer" containerID="db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.936962 4806 scope.go:117] "RemoveContainer" containerID="61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f"
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.963550 4806 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.973227 4806 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4hb9"]
Feb 17 15:58:00 crc kubenswrapper[4806]: I0217 15:58:00.977641 4806 scope.go:117] "RemoveContainer" containerID="cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.005534 4806 scope.go:117] "RemoveContainer" containerID="db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"
Feb 17 15:58:01 crc kubenswrapper[4806]: E0217 15:58:01.006004 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4\": container with ID starting with db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4 not found: ID does not exist" containerID="db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.006077 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4"} err="failed to get container status \"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4\": rpc error: code = NotFound desc = could not find container \"db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4\": container with ID starting with db1daca12eb7658a7e14a748a80f957c9bdad13d78a57c227474b3e5e7e69ce4 not found: ID does not exist"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.006122 4806 scope.go:117] "RemoveContainer" containerID="61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f"
Feb 17 15:58:01 crc kubenswrapper[4806]: E0217 15:58:01.006641 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f\": container with ID starting with 61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f not found: ID does not exist" containerID="61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.006689 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f"} err="failed to get container status \"61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f\": rpc error: code = NotFound desc = could not find container \"61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f\": container with ID starting with 61fe8f186ee90ebf14c9aac5ce40cef025049ab25d8d6b58482c24cd184f661f not found: ID does not exist"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.006725 4806 scope.go:117] "RemoveContainer" containerID="cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab"
Feb 17 15:58:01 crc kubenswrapper[4806]: E0217 15:58:01.007071 4806 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab\": container with ID starting with cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab not found: ID does not exist" containerID="cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.007130 4806 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab"} err="failed to get container status \"cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab\": rpc error: code = NotFound desc = could not find container \"cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab\": container with ID starting with cb4080e2a102c9055ae72f23484c03fc9b51d6290c7d3e76df7872be94a708ab not found: ID does not exist"
Feb 17 15:58:01 crc kubenswrapper[4806]: I0217 15:58:01.176780 4806 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943a7574-b55b-4acf-a432-0a5d1dde6b0b" path="/var/lib/kubelet/pods/943a7574-b55b-4acf-a432-0a5d1dde6b0b/volumes"